CoShMDM: Contact and Shape-Aware Latent Motion Diffusion Model for Human Interaction Generation

Ali Asghar Manjotho, Tekie Tsegay Tewolde, Ramadhani Ally Duma, Zhendong Niu.

The official PyTorch implementation of the paper "CoShMDM: Contact and Shape-Aware Latent Motion Diffusion Model for Human Interaction Generation".

Please visit our webpage for more details.

[Teaser figure]

[Model overview figure]

Getting started

This code was tested on Windows 11 24H2 and requires:

  • Python 3.8.0
  • PyTorch 1.13.1+cu117
  • Anaconda or Miniconda

1. Set up FFmpeg

2. Set up the conda environment

conda create -n coshmdm python==3.8.0
conda activate coshmdm
pip install -r requirements.txt
pip install trimesh h5py chumpy
python -m spacy download en_core_web_sm
  • Download dependencies:
bash protos/smpl_files.sh
bash protos/glove.sh
bash protos/t2m_evaluators.sh

3. Get datasets

Download the datasets from our webpage and place them in ./data/.

Data Structure

<DATA-DIR>
./annots                //Natural language annotations; each file contains three sentences.
./motions               //Raw motion data standardized to SMPL format, similar to AMASS.
./motions_processed     //Processed motion data with joint positions and rotations (6D representation) for the 22-joint SMPL kinematic structure.
./split                 //Train-val-test split.
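
The 6D rotation representation mentioned above typically stores the first two columns of each joint's rotation matrix and recovers the third via Gram-Schmidt orthonormalization. The decoder below is a generic illustration of that scheme, not the repository's own conversion code:

```python
import numpy as np

def rot6d_to_matrix(d6):
    """Recover a 3x3 rotation matrix from a 6D representation
    (the first two matrix columns, Gram-Schmidt orthonormalized)."""
    a1, a2 = d6[:3], d6[3:]
    b1 = a1 / np.linalg.norm(a1)        # normalize the first column
    b2 = a2 - np.dot(b1, a2) * b1       # remove the component along b1
    b2 = b2 / np.linalg.norm(b2)
    b3 = np.cross(b1, b2)               # third column completes the right-handed frame
    return np.stack([b1, b2, b3], axis=-1)

# Round-trip check: the 6D vector of the identity rotation decodes to the identity matrix.
d6 = np.array([1.0, 0.0, 0.0, 0.0, 1.0, 0.0])
R = rot6d_to_matrix(d6)
```

The resulting matrix is always orthonormal even when the stored 6D vector has drifted, which is why this representation is popular for learned motion models.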

Demo

1. Download checkpoints and evaluation models

Run the shell script:

./prepare/download_pretrain_model.sh
./prepare/download_evaluation_model.sh

This will download coshmdm.ckpt to ./checkpoints/ and bert.ckpt to ./eval_model/.

2. Modify the configs

Modify the config files ./configs/model.yaml and ./configs/infer.yaml.

3. Edit the input file ./prompts.txt, one prompt per line, like:

In an intense boxing match, one is continuously punching while the other is defending and counterattacking.
With fiery passion two dancers entwine in Latin dance sublime.
Two fencers engage in a thrilling duel, their sabres clashing and sparking as they strive for victory.
The two are blaming each other and having an intense argument.
Two good friends jump in the same rhythm to celebrate.
Two people bow to each other.
Two people embrace each other.
...
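
Each non-empty line is treated as one prompt. As a rough sketch of how such a file can be parsed (the actual parsing in tools/infer.py may differ), assuming nothing beyond the plain-text format above:

```python
from pathlib import Path

def load_prompts(path):
    """Read one prompt per line, skipping blank lines and surrounding whitespace."""
    lines = Path(path).read_text(encoding="utf-8").splitlines()
    return [line.strip() for line in lines if line.strip()]

# Example usage with a small prompt file written on the fly.
Path("prompts_demo.txt").write_text(
    "Two people bow to each other.\n\nTwo people embrace each other.\n"
)
prompts = load_prompts("prompts_demo.txt")
```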

4. Interaction Generation

4.1 Generate from a single prompt

python -m tools.infer --text_prompt "In an intense boxing match, one is continuously punching while the other is defending and counterattacking." --num_repetitions 3

4.2 Generate from test set prompts (prompts.txt)

python -m tools.infer --num_repetitions 5

4.3 Generate from custom text file

python -m tools.infer --num_repetitions 3 --text_file ./assets/sample_prompts.txt

The results will be rendered and saved to the ./results/ directory.

Train

Modify the config files ./configs/model.yaml, ./configs/datasets.yaml, and ./configs/train.yaml, then run:

python tools/train.py

Evaluation

1. Modify the configs

Modify the config files ./configs/model.yaml and ./configs/datasets.yaml.

2. Run

python tools/eval.py

Rendering SMPL meshes in Blender

  • Download and install Blender from https://www.blender.org/download/.
  • {VER} = your Blender version; replace it accordingly.
  • Blender > Preferences > Interface > check Developer Options.
  • Add the following paths to the PATH environment variable:
C:\Program Files\Blender Foundation\Blender {VER}
C:\Program Files\Blender Foundation\Blender {VER}\{VER}\python\bin
  • Run CMD as Administrator and follow these commands:
"C:\Program Files\Blender Foundation\Blender {VER}\{VER}\python\bin\python.exe" -m ensurepip --upgrade
"C:\Program Files\Blender Foundation\Blender {VER}\{VER}\python\bin\python.exe" -m pip install matplotlib --target="C:\Program Files\Blender Foundation\Blender {VER}\{VER}\scripts\modules"
"C:\Program Files\Blender Foundation\Blender {VER}\{VER}\python\bin\python.exe" -m pip install hydra-core --target="C:\Program Files\Blender Foundation\Blender {VER}\{VER}\scripts\modules"
"C:\Program Files\Blender Foundation\Blender {VER}\{VER}\python\bin\python.exe" -m pip install hydra_colorlog --target="C:\Program Files\Blender Foundation\Blender {VER}\{VER}\scripts\modules"
"C:\Program Files\Blender Foundation\Blender {VER}\{VER}\python\bin\python.exe" -m pip install shortuuid --target="C:\Program Files\Blender Foundation\Blender {VER}\{VER}\scripts\modules"
"C:\Program Files\Blender Foundation\Blender {VER}\{VER}\python\bin\python.exe" -m pip install omegaconf --target="C:\Program Files\Blender Foundation\Blender {VER}\{VER}\scripts\modules"
"C:\Program Files\Blender Foundation\Blender {VER}\{VER}\python\bin\python.exe" -m pip install moviepy==1.0.3 --upgrade  --target="C:\Program Files\Blender Foundation\Blender {VER}\{VER}\scripts\modules"
  • To create an SMPL mesh per frame, run:
python -m visualize.render_mesh --input_path ./results/In_an_intense_boxing_match,_one_is_continuously_/ --repetition_num 0

This script outputs:

  • p1_smpl_params.npy and p2_smpl_params.npy - SMPL parameters (thetas, root translations, vertices, and faces)
  • obj_rep### - one mesh per frame in .obj format.
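
The saved parameter files can be inspected with NumPy; since a dictionary is stored, allow_pickle=True is required on load. The key names and array shapes below are illustrative assumptions; only the field list above (thetas, root translations, vertices, faces) comes from this README:

```python
import numpy as np

# Hypothetical layout mimicking p1_smpl_params.npy; the exact key names in
# the real file are an assumption, not documented here.
params = {
    "thetas": np.zeros((120, 24, 3)),        # per-frame SMPL pose (axis-angle)
    "root_translation": np.zeros((120, 3)),  # per-frame global translation
    "vertices": np.zeros((120, 6890, 3)),    # posed SMPL mesh vertices
    "faces": np.zeros((13776, 3), dtype=np.int64),  # mesh triangle indices
}
np.save("p1_demo.npy", params)  # a dict is stored as a 0-d object array

loaded = np.load("p1_demo.npy", allow_pickle=True).item()
num_frames = loaded["thetas"].shape[0]
```

The same pattern applies to the second person's file.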

Blender Addon for CoShMDM

Once the corresponding OBJ files have been generated in the P1 and P2 folders, you can install our Blender addon to load the interaction animations in Blender and render them; see our CoShMDM-Blender-Addon repository.


Citation

If you find our work helpful in your research, please consider citing the following paper:

@article{manjotho2025coshmdm,
  title={CoShMDM: Contact and Shape-Aware Latent Motion Diffusion Model for Human Interaction Generation},
  author={Manjotho, Ali Asghar and Tewolde, Tekie Tsegay and Duma, Ramadhani Ally and Niu, Zhendong},
  journal={IEEE Transactions on Visualization and Computer Graphics},
  year={2025},
  publisher={IEEE}
}

Acknowledgements

This codebase is built upon the implementation of InterGen, and we would like to thank the authors for making their work publicly available. We also gratefully acknowledge the contributions and inspiration from TEMOS and MDM, which have been valuable references in the development of this project.

Licenses

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

This repository is licensed under the Creative Commons BY-NC-SA 4.0 license. You are free to use, share, and modify the content for non-commercial use, provided that you cite our paper and clearly state any modifications made.
