M-PhyGs: Multi-Material Object Dynamics from Video

Norika Wada, Kohei Yamashita, Ryo Kawahara, and Ko Nishino, Kyoto University

Knowledge of the physical material properties governing the dynamics of a real-world object is necessary to accurately anticipate its response to unseen interactions. Existing methods for estimating such physical material parameters from visual data assume homogeneous single-material objects, pre-learned dynamics, or simplistic topologies. Real-world objects, however, are often complex in material composition and geometry, lying outside the realm of these assumptions. In this paper, we focus on flowers as a representative common object. We introduce Multi-material Physical Gaussians (M-PhyGs) to estimate the material composition and parameters of such multi-material complex natural objects from video. From a short video captured in a natural setting, M-PhyGs jointly segments the object into similar materials and recovers their continuum mechanical parameters while accounting for gravity. M-PhyGs achieves this efficiently with newly introduced cascaded 3D and 2D losses, and by leveraging temporal mini-batching. We introduce a dataset, Phlowers, of people interacting with flowers as a novel platform to evaluate the accuracy of this challenging task of multi-material physical parameter estimation. Experimental results on the Phlowers dataset demonstrate the accuracy and effectiveness of M-PhyGs and its components.
  • M-PhyGs: Multi-Material Object Dynamics from Video
    N. Wada, K. Yamashita, R. Kawahara, and K. Nishino,
    [ arXiv ][ video ][ project ]

Video

M-PhyGs

Overview
From dense multi-view images of a multi-material deformable object in a static state, we first recover a set of 3D Gaussians and uniformly distribute 3D particles inside the object. From a short video of physical interactions with the object, captured from a sparse set of views, M-PhyGs estimates the physical material parameters (Young's modulus and density) of these particles, which drive the 3D Gaussians. The estimation minimizes discrepancies between the predicted and observed dynamics, first in 3D geometry by assuming local rigidity and then in the 2D image plane with full non-rigid dynamics.
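The cascaded optimization described above can be sketched as follows. This is a minimal illustration in numpy, not the authors' implementation: the function names, the neighbor-distance form of the local-rigidity loss, the plain MSE photometric loss, and the contiguous-window sampler are our simplifying assumptions.

```python
import numpy as np

def local_rigidity_loss(pred, obs, neighbors):
    """3D-stage loss (sketch): compare distances between each particle and its
    neighbors in the predicted vs. observed point sets, penalizing local
    non-rigid deformation. `pred`, `obs` are (N, 3) arrays; `neighbors`
    maps a particle index to a list of neighbor indices."""
    loss = 0.0
    for i, js in neighbors.items():
        for j in js:
            d_pred = np.linalg.norm(pred[i] - pred[j])
            d_obs = np.linalg.norm(obs[i] - obs[j])
            loss += (d_pred - d_obs) ** 2
    return loss

def photometric_loss(rendered, observed):
    """2D-stage loss (sketch): mean squared per-pixel discrepancy between the
    rendered Gaussians and the observed video frame."""
    return float(np.mean((rendered - observed) ** 2))

def sample_temporal_minibatch(n_frames, window, rng):
    """Temporal mini-batching (sketch): draw one contiguous window of frames
    per optimization step instead of unrolling the entire video."""
    start = int(rng.integers(0, n_frames - window + 1))
    return np.arange(start, start + window)
```

In this reading, the cheaper geometric loss first pulls the particle material parameters into a plausible regime, after which the image-space loss refines them under the full non-rigid dynamics.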

Phlowers Dataset

We introduce a novel dataset, which we refer to as the Phlowers dataset (physics of flowers), focused on real flowers as a representative, challenging, yet natural multi-material object. Phlowers consists of real multi-view videos of 10 flowers. For each flower, we captured videos from 5 different viewpoints as a person inserts the flower into a flower frog. Each video contains at least 100 frames. The intrinsic and extrinsic camera parameters are estimated with COLMAP together with the dense view capture of the static scene. The coordinate scales and rotations are aligned using ChArUco boards, and the videos are synchronized using timecode.
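Aligning coordinate scales and rotations between reconstructions, as done here via shared ChArUco board corners, amounts to estimating a similarity transform from corresponding 3D points. Below is a closed-form Umeyama-style sketch; the function name and the assumption of already-matched corner correspondences are ours, not details from the dataset pipeline.

```python
import numpy as np

def similarity_align(src, dst):
    """Estimate scale s, rotation R, translation t minimizing
    || s * R @ src_i + t - dst_i ||^2 over corresponding (N, 3) points,
    e.g. ChArUco corners seen in two reconstructions (Umeyama closed form)."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    xs, xd = src - mu_s, dst - mu_d
    cov = xd.T @ xs / len(src)                    # cross-covariance
    U, S, Vt = np.linalg.svd(cov)
    d = np.sign(np.linalg.det(U @ Vt))            # guard against reflections
    D = np.diag([1.0, 1.0, d])
    R = U @ D @ Vt
    var_src = (xs ** 2).sum() / len(src)
    s = np.trace(np.diag(S) @ D) / var_src
    t = mu_d - s * (R @ mu_s)
    return s, R, t
```

Given at least three non-collinear shared board corners, this recovers the scale and rotation needed to place both captures in one coordinate frame.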

Results

Material Parameter Estimation

Material visualization
Estimated physical material parameters of our method and existing methods. For OmniPhysGS, the constitutive model (i.e., the probability of whether a particle is elastic or not) is shown; for gs-dynamics, the node features are shown. The estimated per-segment material parameters of M-PhyGs form clusters that roughly align with the different object parts.

In-Sequence Dynamics

Each panel shows the Observation alongside Ours (RGB / Young's modulus).
We visualize qualitative results of dynamics prediction for unseen interactions using the estimated parameters. In-sequence results show the dynamics prediction on the training interval (the first 5 seconds) and its continuation (the remaining 4 to 5 seconds).

Cross-Sequence Dynamics

Each panel shows the Observation alongside Ours (RGB / Young's modulus).
Cross-sequence results show dynamics prediction on a sequence distinct from the training data.

Comparison

Each panel shows the Observation, Ours, and one baseline: OmniPhysGS, PhysDreamer, Pixie, Spring-Gaus, gs-dynamics, or GIC.