
InterMoE: Individual-Specific 3D Human Interaction Generation via Dynamic Temporal-Selective MoE

This repository contains the official implementation for the paper: InterMoE: Individual-Specific 3D Human Interaction Generation via Dynamic Temporal-Selective MoE (AAAI-26).

TODO

  • Evaluation code.
  • Inference code.
  • Checkpoints for the InterHuman dataset.
  • Training code.
  • Visualization code.
  • Checkpoints for the Inter-X dataset.

Getting started

This code was tested on Ubuntu 20.04 LTS and requires:

  • Python 3.10
  • Anaconda or Miniconda
  • A CUDA-capable GPU

1. Setup environment

conda create -n intermoe python=3.10
conda activate intermoe
pip install -r requirements.txt
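
After installing the requirements, a quick sanity check confirms that a CUDA-capable GPU is visible. This is a minimal sketch, assuming PyTorch is pulled in by requirements.txt; it is not part of the repository:

# Minimal sketch: confirm PyTorch sees a CUDA-capable GPU.
# Assumes PyTorch is installed via requirements.txt.
import torch

print("PyTorch:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))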

2. Get the datasets and miscellaneous files

  1. InterHuman: Download the InterHuman dataset from its official webpage and put it into data/InterHuman. Then download the model used for evaluation from this script.

  2. Inter-X: Download the Inter-X dataset from its official repo and put it into data/InterX.

The final data structure should look like this:

data/InterHuman/
    annots/
    motions/
    motions_processed/
    split/
    ...

data/InterX/
    h5/
    processed/
    splits/
    text2motion/
    ...

eval_model/
  ...
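
Before running anything, you can sanity-check this layout. The snippet below is a minimal sketch (folder names are taken directly from the structure shown above) and is not part of the repository:

# Minimal sketch: verify the dataset layout shown above exists.
from pathlib import Path

expected = {
    "data/InterHuman": ["annots", "motions", "motions_processed", "split"],
    "data/InterX": ["h5", "processed", "splits", "text2motion"],
}

for root, subdirs in expected.items():
    for sub in subdirs:
        path = Path(root) / sub
        print(f"{path}: {'ok' if path.is_dir() else 'MISSING'}")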

Inference

1. Download the checkpoint

Download the checkpoints from Google Drive, unzip the archive, and put the contents under the checkpoints folder.

For example, the layout will look like:

checkpoints/
  intermoe-interhuman/
    model/
      ...
    config.yaml

2. Modify the input file ./prompts.txt (one prompt per line), for example:

The two are blaming each other and having an intense argument.
Two fencers engage in a thrilling duel, their sabres clashing and sparking as they strive for victory.
Two individuals are practicing tai chi.
Two people bow to each other.
In an intense boxing match, one is continuously punching while the other is defending and counterattacking.
...
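
For reference, a file in this format is simply one prompt per line. The sketch below shows how such a file can be read; it is illustrative only, and infer.py's actual loader may differ:

# Illustrative sketch: read prompts.txt, one prompt per line.
# infer.py's actual loading logic may differ.
with open("prompts.txt", encoding="utf-8") as f:
    prompts = [line.strip() for line in f if line.strip()]

print(f"Loaded {len(prompts)} prompts; first: {prompts[0]!r}")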

3. Run

Set model.vae_ckpt and model.CHECKPOINT in the config file at CFG_PATH (a programmatic sketch follows below) and run:

python infer.py --cfg ${CFG_PATH}

# for example, the CFG_PATH can be checkpoints/intermoe-interhuman/config.yaml.

The results will be in the GENERAL.CHECKPOINT/GENERAL.EXP_NAME/results sub-folder.
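
If you prefer to set the checkpoint paths programmatically rather than editing the YAML by hand, a sketch with OmegaConf follows. This assumes config.yaml is OmegaConf-compatible YAML, and both checkpoint paths below are placeholders:

# Sketch: set the checkpoint keys in config.yaml with OmegaConf.
# Assumes the config is plain YAML; both paths below are placeholders.
from omegaconf import OmegaConf

cfg_path = "checkpoints/intermoe-interhuman/config.yaml"
cfg = OmegaConf.load(cfg_path)
OmegaConf.update(cfg, "model.vae_ckpt", "/path/to/vae.ckpt")
OmegaConf.update(cfg, "model.CHECKPOINT", "/path/to/intermoe.ckpt")
OmegaConf.save(cfg, cfg_path)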

Train

Coming soon.

Evaluation

Set model.vae_ckpt and model.CHECKPOINT in the config file at CFG_PATH and run:

# for interhuman
python eval_interhuman.py --cfg ${CFG_PATH}

# for interx
python eval_interx.py --cfg ${CFG_PATH}

To evaluate only the VAE, set model.CHECKPOINT in the config file at VAE_CFG_PATH and run:

python eval_interhuman.py --cfg ${VAE_CFG_PATH}

Acknowledgement

We appreciate the open-source releases of the following projects:

InterGen, Inter-X, salad, motion-latent-diffusion, motion-diffusion-model, etc.

Citation

If you find our work useful in your research, please consider citing:

@article{}

Licenses

This work is licensed under a Creative Commons Attribution 4.0 International License.
