Announcement


Recovering Garment Dynamic Motion from Videos

3 May 2024


Category: PhD student


The LIRIS team at Ecole Centrale de Lyon, France, is looking for a PhD student to work on garment modelling. The goal is to learn garment dynamics from videos, using deformable 3D reconstruction techniques to recover the 3D dynamic motion depicted in the videos.

The PhD will be supervised by Dr. Shaifali Parashar (shaifali.parashar@liris.cnrs.fr).
 
Requirements:
  1. A Master's degree (or equivalent) in computer vision and mathematics, with a strong publication record
  2. Strong programming skills in C++ and Python
  3. Fluency in English

Project duration: 36 months

Tentative start date: September 2024

How to apply:

Please send your CV, transcripts and two letters of recommendation to shaifali.parashar@liris.cnrs.fr with the subject "PhD position in Recovering Garment Dynamic Motion from Videos".

 

Modelling digital garments is a challenging yet important problem. Many real-time applications, such as video game simulations, fashion design and e-commerce virtual try-on systems, require realistic modelling of digital garments. Traditional cloth simulators [1] use physics-based modelling to simulate cloth deformations. These methods are prohibitively slow and therefore unsuitable for real-time applications. With the advent of deep learning, this field has progressed tremendously in the past decade. Gundogdu et al. [2] showed that fast cloth simulation can be achieved by learning cloth deformations in a supervised manner. Recent methods, such as GAPS [3], learn cloth deformations in a self-supervised way, thereby removing the dependency on large amounts of 3D ground-truth data.

However, garment dynamics are still modelled unrealistically. This is because garment dynamics are governed by many forces, such as gravity, impact, stretch, friction and wind, which are difficult to model independently. This project aims to explore the effects of these forces on a range of garments made from various materials. We shall gather video footage, which can be captured conveniently with smartphones or other inexpensive cameras, or sourced from online platforms. Our goal is to analyse how garments behave dynamically in different scenarios, considering factors such as motion, material type and applied forces. To achieve this, we will use deformable 3D reconstruction techniques, such as [4], to recover the dynamic motion of the garments depicted in the videos. By doing so, we aim to gain insights into garment dynamics without relying heavily on extensive 3D ground-truth data.
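To give a flavour of the physics-based cloth modelling discussed above, the sketch below simulates a toy mass-spring cloth, a square grid of particles connected by structural springs, under two of the forces mentioned (gravity and stretch), using semi-implicit Euler integration. It is purely illustrative: the function name and all parameter values are our own choices for this sketch, not part of any cited simulator.

```python
import numpy as np

def simulate_cloth(n=8, steps=100, dt=0.005, k=500.0, mass=0.05,
                   gravity=-9.81, damping=0.98):
    """Toy mass-spring cloth: an n x n particle grid with structural
    springs, the top row pinned, integrated with semi-implicit Euler."""
    # Rest positions on a unit grid in the x-y plane.
    xs, ys = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n))
    pos = np.stack([xs, ys, np.zeros_like(xs)], axis=-1)  # (n, n, 3)
    vel = np.zeros_like(pos)
    rest = 1.0 / (n - 1)  # rest length of structural springs

    def spring_forces(p):
        f = np.zeros_like(p)
        # Hooke's law on horizontal and vertical neighbour springs.
        for axis in (0, 1):
            d = np.diff(p, axis=axis)                       # edge vectors
            length = np.linalg.norm(d, axis=-1, keepdims=True)
            fs = k * (length - rest) * d / np.maximum(length, 1e-9)
            pad = [(0, 0)] * 3
            pad[axis] = (0, 1)
            f += np.pad(fs, pad)   # pull first endpoint towards second
            pad[axis] = (1, 0)
            f -= np.pad(fs, pad)   # equal and opposite on second endpoint
        return f

    for _ in range(steps):
        f = spring_forces(pos)
        f[..., 1] += mass * gravity            # gravity acts along -y
        vel = damping * (vel + dt * f / mass)  # velocity first...
        vel[-1] = 0.0                          # pin the top (y = 1) row
        pos = pos + dt * vel                   # ...then position
    return pos

cloth = simulate_cloth()
```

After 100 steps the pinned top row stays in place while the free rows sag below their initial height, which is the qualitative behaviour a physics-based simulator such as [1] reproduces at much higher fidelity.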

[1] Narain et al., Siggraph Asia 2012. Adaptive Anisotropic Remeshing for Cloth Simulation.

[2] Gundogdu et al., TPAMI 2020. GarNet++: Improving Fast and Accurate Static 3D Cloth Draping by Curvature Loss.

[3] Chen et al., 3DV 2024. GAPS: Geometry-Aware, Physics-Based, Self-Supervised Neural Garment Draping.

[4] Parashar et al., TPAMI 2024. A Closed-Form, Pairwise Solution to Local Non-Rigid Structure-from-Motion.