Nikola B. Kovachki

California Institute of Technology

MC 305-16

1200 E. California Blvd.

Pasadena, CA 91125

nkovachki (at) caltech (dot) edu

I am currently a final-year PhD student in applied mathematics at Caltech, working on machine learning methods for the physical sciences in both theory and practice. I work in collaboration with my advisor, Prof. Andrew M. Stuart, and have also collaborated with various experts in machine learning and physical modeling, including Prof. Anima Anandkumar, Prof. Kaushik Bhattacharya, Prof. Maarten V. de Hoop, Prof. Youssef Marzouk, Prof. Tom Miller, and Prof. Siddhartha Mishra. I am originally from Sofia, Bulgaria, but have lived in the US since 2005 (Atlanta → LA). I received a B.Sc. in mathematics from Caltech in 2016. I am a recipient of the 2020 Amazon AI4Science Fellowship, which recognizes the outstanding work of graduate students in machine learning that impacts other scientific fields. My work has been covered in popular science magazines such as MIT Technology Review and Quanta Magazine, and was recently highlighted in NVIDIA CEO Jensen Huang’s GTC 2021 keynote address.

My broader interests include anything mathematically beautiful or related to machine learning. In particular, I am excited about the approximation theory of neural networks, the application of data-driven techniques to inverse problems, the theory and application of operator learning techniques for imaging and the computational sciences, and the development of uncertainty quantification techniques with deep neural networks. I am also interested in the large-scale deployment and integration of learning systems on supercomputers for more efficient physical simulations, and on computationally limited hardware for consumer and commercial products.

I love spending my free time outdoors whenever possible. I particularly enjoy hiking, mountain biking, snowboarding, surfing, and skateboarding. I’ve also recently become an avid runner. Indoors, I love experimenting with new cooking techniques (recently fermentation), experiencing all forms of art and music and badly attempting to create my own, and exploring LA for new food and coffee. Last but certainly not least, I love spending time with my amazing dog.

I am currently on the job market! Please find my CV and contact me if you think I’d be a good fit for your team.

news

Nov 22, 2021 Our paper on universal approximation and error rates for Fourier Neural Operators (FNOs) was accepted for publication in the Journal of Machine Learning Research.
Nov 11, 2021 I spoke (virtually) at the Rough Paths Interest Group at the Alan Turing Institute.
Nov 11, 2021 Our paper on data-driven, multiscale materials modeling was accepted for publication in the journal Mechanics of Materials.
Nov 6, 2021 Our new paper combining operator learning with physics-informed constraints is on the arXiv.
Oct 22, 2021 Our paper on learning for multiscale materials modeling was published in the Journal of the Mechanics and Physics of Solids.

selected publications

  1. arXiv
    Neural Operator: Learning Maps Between Function Spaces
    Nikola B. Kovachki, Zongyi Li, Burigede Liu, Kamyar Azizzadenesheli, Kaushik Bhattacharya, Andrew M. Stuart, and Anima Anandkumar
    CoRR 2021
  2. arXiv
    Conditional Sampling with Monotone GANs
    Nikola B. Kovachki, Ricardo Baptista, Bamdad Hosseini, and Youssef Marzouk
    CoRR 2020
  3. ICLR
    Fourier Neural Operator for Parametric Partial Differential Equations
    Zongyi Li, Nikola B. Kovachki, Kamyar Azizzadenesheli, Burigede Liu, Kaushik Bhattacharya, Andrew M. Stuart, and Anima Anandkumar
    In the 9th International Conference on Learning Representations (ICLR), 2021
  4. JMLR
    Continuous Time Analysis of Momentum Methods
    Nikola B. Kovachki and Andrew M. Stuart
    Journal of Machine Learning Research 2021
  5. IP
    Ensemble Kalman Inversion: a Derivative-free Technique for Machine Learning Tasks
    Nikola B. Kovachki and Andrew M. Stuart
    Inverse Problems 2019