I’m a research scientist at Deep Render, working on image and video compression. Previously I was a post-doc in machine learning at McGill University. I completed my PhD in applied math under the supervision of Adam Oberman. My research interests lie at the intersection of deep learning, machine learning, and applied math.
|Jul 18, 2020||Our work on Normalizing Flows and potential functions was featured as a contributed talk at the INNF workshop at ICML|
|Jul 6, 2020||Started as a research scientist at Deep Render|
|Jun 11, 2020||New preprint on learning Continuous Normalizing Flows as gradients of potential functions, using Optimal Transport & duality|
|Jun 1, 2020||"How to train your neural ODE: ..." was accepted to ICML 2020|
|Apr 23, 2020||Gave a talk at UCLA during the IPAM workshop on PDEs and Inverse Problems in Machine Learning|