Meiyi Li
University of Texas at Austin
meiyil@utexas.edu
Bio
Yifei Li is a Ph.D. candidate in EECS at MIT, working in the Computational Design and Fabrication Group at CSAIL and advised by Prof. Wojciech Matusik. Her research lies at the intersection of computer graphics, differentiable physical simulation, computational design, and machine learning, with a focus on hybrid neural-physics methods for self-adaptive physical systems. She is a recipient of the MIT Stata Family Presidential Fellowship and was named a Rising Star in Computer Graphics. Yifei received a B.S. in Computer Science from Carnegie Mellon University and has held research internships at Meta Reality Labs, NVIDIA, the Boston Dynamics AI Institute, and Facebook AI Research.
Areas of Research
- Electromagnetics and Energy
Trustworthy AI in Energy Systems: Shaping the Energy-Intelligence Future
Next-generation engineered systems, from artificial hearts that adapt pumping waveforms to medical sleeves that reshape fabric panels around healing limbs, must continually update geometry, material parameters, and control policies in response to streaming data. Achieving this vision requires computational frameworks that couple high-fidelity continuum mechanics with data-driven learning. Classical simulators offer physical fidelity but are slow and non-differentiable, blocking inverse design and real-time adaptation. Purely neural surrogates are fast and differentiable, yet demand large datasets and often drift from governing laws. My research develops hybrid neural-physics methods that retain first-principles accuracy while exposing exact gradients, bridging simulation and machine learning for adaptive continuum systems.
The first step was to establish that high-fidelity solvers could be made differentiable while preserving physical laws. By backpropagating exact derivatives through deformable-solid simulation and enforcing fluid dynamics within topology optimization, I demonstrated large-scale, gradient-based design of both cloth and fluidic systems. These solvers also exposed an interface for physics-informed neural design and control: backpropagating simulation gradients into neural network parameters enabled tasks such as artificial-heart design and control. In parallel, I developed real-to-sim pipelines that reconstruct body shape, fabric, and material parameters directly from observations, yielding digital humans whose motion and drape align with reality.
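To make this gradient flow concrete, the short sketch below uses a toy mass-spring system in JAX; it is a minimal illustration, not my cloth or fluid solvers, and every name in it (controller_mlp, step, design_loss) is assumed for the example. A design loss is differentiated exactly with respect to both a physical parameter (spring stiffness) and the weights of a small neural controller.

import jax
import jax.numpy as jnp

def controller_mlp(params, x):
    # Tiny neural controller: observes the state and outputs an actuation force.
    w1, b1, w2, b2 = params
    h = jnp.tanh(x @ w1 + b1)
    return h @ w2 + b2

def step(state, stiffness, ctrl_params, dt=1e-2):
    # Explicit-Euler update of a point mass on a spring, plus learned actuation.
    pos, vel = state
    force = -stiffness * pos + controller_mlp(ctrl_params, jnp.concatenate([pos, vel]))
    vel = vel + dt * force
    pos = pos + dt * vel
    return (pos, vel)

def design_loss(stiffness, ctrl_params, target):
    # Roll the dynamics forward and penalize distance from a target rest position.
    state = (jnp.ones(2), jnp.zeros(2))
    for _ in range(100):
        state = step(state, stiffness, ctrl_params)
    return jnp.sum((state[0] - target) ** 2)

key1, key2 = jax.random.split(jax.random.PRNGKey(0))
ctrl_params = (0.1 * jax.random.normal(key1, (4, 8)), jnp.zeros(8),
               0.1 * jax.random.normal(key2, (8, 2)), jnp.zeros(2))

# Exact gradients with respect to the design variable and the controller weights.
d_stiffness, d_ctrl = jax.grad(design_loss, argnums=(0, 1))(2.0, ctrl_params, jnp.zeros(2))

Because the solver itself is differentiable, one gradient call drives both inverse design (the physical parameter) and learning (the network weights), which is the pattern the cloth and fluidic systems above exploit at much larger scale.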
These advances naturally motivated the need for modularity: an engine in which physics-based components and neural surrogates coexist within a single framework. My recent work on Neural Modular Physics introduces such a composable architecture, preserving physical invariants while allowing learned components to take over where first-principles models are uncertain.
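The modularity idea can be sketched in a few lines; the fragment below is illustrative only and does not reflect the actual Neural Modular Physics implementation (all names here are assumptions). Every module shares one interface, mapping (parameters, state) to a new state, so an analytic physics update and a learned residual correction compose into a single differentiable rollout.

import jax
import jax.numpy as jnp

def gravity_module(params, state, dt=1e-2):
    # First-principles module: exact free-fall update (no learnable parameters).
    pos, vel = state
    return (pos + dt * vel, vel + dt * jnp.array([0.0, -9.81]))

def neural_module(params, state, dt=1e-2):
    # Learned surrogate: a small network adds a residual force correction,
    # standing in for effects the analytic modules do not capture.
    pos, vel = state
    w, b = params
    delta = jnp.tanh(jnp.concatenate([pos, vel]) @ w + b)
    return (pos, vel + dt * delta)

def rollout(module_params, state, num_steps=50):
    # Shared interface: every module maps (params, state) -> state,
    # so physics-based and neural components can be mixed in any order.
    for _ in range(num_steps):
        for module, p in zip((gravity_module, neural_module), module_params):
            state = module(p, state)
    return state

params = (None, (0.1 * jnp.ones((4, 2)), jnp.zeros(2)))
init_state = (jnp.zeros(2), jnp.array([1.0, 1.0]))

def loss(module_params):
    final_pos, _ = rollout(module_params, init_state)
    return jnp.sum(final_pos ** 2)

# Gradients flow through the analytic and learned modules alike.
grads = jax.grad(loss)(params)

The interesting design choice is where to draw the boundary: components whose physics is well understood stay analytic and keep their invariants, while uncertain ones are corrected or replaced by learned modules.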
Looking forward, I aim to generalize this agenda into a unified differentiable multi-physics framework that couples solids, fluids, and deformables while maintaining robust sim-to-real pipelines. Such a hybrid approach would turn simulation into a continuously learning, self-redesigning engine for adaptive physical systems.