Ali Asghar Manjotho, Tekie Tsegay Tewolde, Ramadhani Ally Duma, Zhendong Niu.
The official PyTorch implementation of the paper "CoShMDM: Contact and Shape Aware Latent Motion Diffusion Model for Human Interaction Generation".
Please visit our webpage for more details.
This code was tested on Windows 11 24H2 and requires:
- Python 3.8.0
- PyTorch 1.13.1+cu117
- conda3 or miniconda3
- Download ffmpeg from https://www.ffmpeg.org/download.html#build-windows
- Extract it to `C:\ffmpeg`.
- Add `C:\ffmpeg\bin` to the `PATH` environment variable.
conda create -n coshmdm python==3.8.0
conda activate coshmdm
pip install -r requirements.txt
python -m spacy download en_core_web_sm
pip install trimesh h5py chumpy
- Download dependencies:
bash protos/smpl_files.sh
bash protos/glove.sh
bash protos/t2m_evaluators.sh
Download the data from our webpage and put it into `./data/`.
<DATA-DIR>
./annots              //Natural language annotations, where each file consists of three sentences.
./motions             //Raw motion data standardized as SMPL, similar to AMASS.
./motions_processed   //Processed motion data with joint positions and rotations (6D representation) of the 22-joint SMPL kinematic structure.
./split               //Train-val-test split.
Run the shell script:
./prepare/download_pretrain_model.sh
./prepare/download_evaluation_model.sh
This will download `coshmdm.ckpt` under `./checkpoints/` and `bert.ckpt` under `./eval_model/`.
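The `./motions_processed` data above stores joint rotations in the 6D representation. As background, here is a minimal NumPy sketch of the standard 6D encoding (the first two columns of the rotation matrix) and its Gram-Schmidt decoding; this only illustrates the representation and is not the repo's actual conversion code.

```python
import numpy as np

def matrix_to_6d(R):
    # 6D representation: the first two columns of a 3x3 rotation matrix.
    return R[:, :2].T.reshape(6)

def sixd_to_matrix(d6):
    # Recover a valid rotation matrix via Gram-Schmidt orthonormalization.
    a1, a2 = d6[:3], d6[3:]
    b1 = a1 / np.linalg.norm(a1)
    b2 = a2 - np.dot(b1, a2) * b1
    b2 = b2 / np.linalg.norm(b2)
    b3 = np.cross(b1, b2)          # third column from the cross product
    return np.stack([b1, b2, b3], axis=1)

# Round trip for a 90-degree rotation about the z-axis.
R = np.array([[0., -1., 0.],
              [1.,  0., 0.],
              [0.,  0., 1.]])
print(np.allclose(sixd_to_matrix(matrix_to_6d(R)), R))  # → True
```

The decoding step makes the representation robust: any 6 numbers close to a valid encoding map back to an exact rotation matrix, which is why 6D is preferred over axis-angle or quaternions for regression.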
Modify config files ./configs/model.yaml and ./configs/infer.yaml
In an intense boxing match, one is continuously punching while the other is defending and counterattacking.
With fiery passion two dancers entwine in Latin dance sublime.
Two fencers engage in a thrilling duel, their sabres clashing and sparking as they strive for victory.
The two are blaming each other and having an intense argument.
Two good friends jump in the same rhythm to celebrate.
Two people bow to each other.
Two people embrace each other.
...

python -m tools.infer --text_prompt "In an intense boxing match, one is continuously punching while the other is defending and counterattacking." --num_repetitions 3

python -m tools.infer --num_repetitions 5

python -m tools.infer --num_repetitions 3 --text_file ./assets/sample_prompts.txt

The results will be rendered and saved in the `./results/` directory.
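Judging by the results path used later in this README (`./results/In_an_intense_boxing_match,_one_is_continuously_/`), the output folder name appears to be derived from the prompt. A hypothetical reconstruction of that mapping — the helper name and the 48-character truncation length are guesses, not the repo's actual code:

```python
def prompt_to_dirname(prompt: str, max_len: int = 48) -> str:
    """Hypothetical sketch: truncate the prompt and replace spaces
    with underscores to form the results folder name."""
    return prompt[:max_len].replace(" ", "_")

prompt = ("In an intense boxing match, one is continuously punching "
          "while the other is defending and counterattacking.")
print(prompt_to_dirname(prompt))
# → In_an_intense_boxing_match,_one_is_continuously_
```

Knowing the naming scheme is handy when scripting over many prompts, since the folder name is what you pass to `visualize.render_mesh` later.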
Modify config files ./configs/model.yaml ./configs/datasets.yaml and ./configs/train.yaml, and then run:
python tools/train.py

Modify config files ./configs/model.yaml and ./configs/datasets.yaml
python tools/eval.py

- Download and install Blender from https://www.blender.org/download/.
- `{VER}` = your Blender version; replace it accordingly.
- Blender > Preferences > Interface > check Developer Options
- Add the following paths to PATH environment variable.
C:\Program Files\Blender Foundation\Blender {VER}
C:\Program Files\Blender Foundation\Blender {VER}\{VER}\python\bin
- Run CMD as Administrator and run the following commands:
"C:\Program Files\Blender Foundation\Blender {VER}\{VER}\python\bin\python.exe" -m ensurepip --upgrade
"C:\Program Files\Blender Foundation\Blender {VER}\{VER}\python\bin\python.exe" -m pip install matplotlib --target="C:\Program Files\Blender Foundation\Blender {VER}\{VER}\scripts\modules"
"C:\Program Files\Blender Foundation\Blender {VER}\{VER}\python\bin\python.exe" -m pip install hydra-core --target="C:\Program Files\Blender Foundation\Blender {VER}\{VER}\scripts\modules"
"C:\Program Files\Blender Foundation\Blender {VER}\{VER}\python\bin\python.exe" -m pip install hydra_colorlog --target="C:\Program Files\Blender Foundation\Blender {VER}\{VER}\scripts\modules"
"C:\Program Files\Blender Foundation\Blender {VER}\{VER}\python\bin\python.exe" -m pip install shortuuid --target="C:\Program Files\Blender Foundation\Blender {VER}\{VER}\scripts\modules"
"C:\Program Files\Blender Foundation\Blender {VER}\{VER}\python\bin\python.exe" -m pip install omegaconf --target="C:\Program Files\Blender Foundation\Blender {VER}\{VER}\scripts\modules"
"C:\Program Files\Blender Foundation\Blender {VER}\{VER}\python\bin\python.exe" -m pip install moviepy==1.0.3 --upgrade --target="C:\Program Files\Blender Foundation\Blender {VER}\{VER}\scripts\modules"
- To create an SMPL mesh per frame, run:
python -m visualize.render_mesh --input_path ./results/In_an_intense_boxing_match,_one_is_continuously_/ --repetition_num 0

This script outputs:
- `p1_smpl_params.npy` and `p2_smpl_params.npy` - SMPL parameters (thetas, root translations, vertices, and faces)
- `obj_rep###` - mesh per frame in `.obj` format.
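A minimal sketch of reading the exported parameters back with NumPy. The key names and array shapes below are assumptions based on the output description above (the standard SMPL template has 6890 vertices and 13776 faces), so inspect your own file to confirm:

```python
import numpy as np

# Stand-in dictionary mimicking what p1_smpl_params.npy might contain;
# key names and shapes are assumptions, not taken from the repo.
dummy = {
    "thetas": np.zeros((120, 24, 3)),        # per-frame SMPL pose parameters
    "root_translation": np.zeros((120, 3)),  # per-frame root translations
    "vertices": np.zeros((120, 6890, 3)),    # per-frame SMPL mesh vertices
    "faces": np.zeros((13776, 3), dtype=np.int64),
}
np.save("p1_smpl_params_demo.npy", dummy)

# A .npy file holding a Python dict must be loaded with allow_pickle=True
# and unwrapped with .item().
params = np.load("p1_smpl_params_demo.npy", allow_pickle=True).item()
print(sorted(params))            # → ['faces', 'root_translation', 'thetas', 'vertices']
print(params["vertices"].shape)  # → (120, 6890, 3)
```

The `allow_pickle=True` / `.item()` pattern is what trips most people up when a `.npy` file stores a dictionary rather than a plain array.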
Once the corresponding OBJ files have been generated in the P1 and P2 folders, you can install and use our Blender add-on to load the interaction animations in Blender and render them. See our CoShMDM-Blender-Addon repository.
If you find our work helpful in your research, please consider citing the following paper:
@article{manjotho2025coshmdm,
title={CoShMDM: Contact and Shape-Aware Latent Motion Diffusion Model for Human Interaction Generation},
author={Manjotho, Ali Asghar and Tewolde, Tekie Tsegay and Duma, Ramadhani Ally and Niu, Zhendong},
journal={IEEE Transactions on Visualization and Computer Graphics},
year={2025},
publisher={IEEE}
}
This codebase is built upon the implementation of InterGen, and we would like to thank the authors for making their work publicly available. We also gratefully acknowledge the contributions and inspiration from TEMOS and MDM, which have been valuable references in the development of this project.
This work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0) License. You are free to use, share, and modify the content for non-commercial purposes, provided that you cite our paper and clearly state any modifications made.





