
Commit b448017

committed
Init
1 parent 5354c10 commit b448017

File tree

9 files changed

+867
-2
lines changed


.gitignore

+145
@@ -0,0 +1,145 @@
# Customized
.vscode
data
wandb
results
work_dir*
ft_local
demo.sh
examples

*.pth
*.zip
*.pkl

# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class
*.pyc

# C extensions
*.so

# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
pip-wheel-metadata/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/

# Translations
*.mo
*.pot

# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal

# Flask stuff:
instance/
.webassets-cache

# Scrapy stuff:
.scrapy

# Sphinx documentation
docs/_build/

# PyBuilder
target/

# Jupyter Notebook
.ipynb_checkpoints

# IPython
profile_default/
ipython_config.py

# pyenv
.python-version

# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock

# PEP 582; used by e.g. github.com/David-OConnor/pyflow
__pypackages__/

# Celery stuff
celerybeat-schedule
celerybeat.pid

# SageMath parsed files
*.sage.py

# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/

# Spyder project settings
.spyderproject
.spyproject

# Rope project settings
.ropeproject

# mkdocs documentation
/site

# mypy
.mypy_cache/
.dmypy.json
dmypy.json

# Pyre type checker
.pyre/
core-python*

LICENSE

+176
Large diffs are not rendered by default.

README.md

mode change 100644 → 100755
+162-2
@@ -1,2 +1,162 @@
-# AMT
-Official code for "AMT: All-Pairs Multi-Field Transforms for Efficient Frame Interpolation" (CVPR2023)
# AMT: All-Pairs Multi-Field Transforms for Efficient Frame Interpolation

This repository contains the official implementation of the following paper:
> **AMT: All-Pairs Multi-Field Transforms for Efficient Frame Interpolation**<br>
> [Zhen Li](https://paper99.github.io/)<sup>\*</sup>, [Zuo-Liang Zhu](https://nk-cs-zzl.github.io/)<sup>\*</sup>, [Ling-Hao Han](https://scholar.google.com/citations?user=0ooNdgUAAAAJ&hl=en), [Qibin Hou](https://scholar.google.com/citations?hl=en&user=fF8OFV8AAAAJ&view_op=list_works), [Chun-Le Guo](https://scholar.google.com/citations?hl=en&user=RZLYwR0AAAAJ), [Ming-Ming Cheng](https://mmcheng.net/)<br>
> (\* denotes equal contribution) <br>
> Nankai University <br>
> In CVPR 2023<br>

[[Paper](https://github.com/MCG-NKU/E2FGVI)]
[[Project Page](https://nk-cs-zzl.github.io/projects/amt/index.html)]
[[Web demos](#web-demos)]
[Video]

AMT is a **lightweight, fast, and accurate** algorithm for frame interpolation.
It aims to provide practical solutions for **video generation** from **a few given frames (at least two)**.

![Demo gif](assets/amt_demo.gif)
* More examples can be found on our [project page](https://nk-cs-zzl.github.io/projects/amt/index.html).

## Web demos
Integrated into [Hugging Face Spaces 🤗](https://huggingface.co/spaces) using [Gradio](https://github.com/gradio-app/gradio). Try out the web demo: [![Hugging Face Spaces](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/johngoad/frame-interpolation)

Try AMT to interpolate between two or more images at [![PyTTI-Tools:FILM](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1IeVO5BmLouhRh6fL2z_y18kgubotoaBq?usp=sharing)

## Change Log
- **Apr 20, 2023**: Our code is publicly available.

## Method Overview
![pipeline](https://user-images.githubusercontent.com/21050959/229420451-65951bd0-732c-4f09-9121-f291a3862d6e.png)

For technical details, please refer to [method.md](docs/method.md) or read the full report on [arXiv]().
## Dependencies and Installation
38+
1. Clone Repo
39+
40+
```bash
41+
git clone https://github.com/MCG-NKU/AMT.git
42+
```
43+
44+
2. Create Conda Environment and Install Dependencies
45+
46+
```bash
47+
conda env create -f environment.yaml
48+
conda activate amt
49+
```
50+
3. Download pretrained models for demos from [Pretrained Models](#pretrained-models) and place them to the `pretrained` folder
51+
52+
## Quick Demo

**Note that the selected pretrained model (`[CKPT_PATH]`) needs to match the config file (`[CFG]`).**

> When creating a video demo, increasing $n$ slows down the motion in the video. (With $m$ input frames, `[N_ITER]` $=n$ yields $2^n\times (m-1)+1$ output frames.)

```bash
python demos/demo_2x.py -c [CFG] -p [CKPT] -n [N_ITER] -i [INPUT] -o [OUT_PATH] -r [FRAME_RATE]
# e.g. [INPUT]
# -i can be a video / a glob pattern / a folder containing multiple images
# -i demo.mp4 (video) / img_*.png (glob pattern) / img0.png img1.png (images) / demo_input (folder)

# e.g. a simple usage
python demos/demo_2x.py -c cfgs/AMT-S.yaml -p pretrained/amt-s.pth -n 6 -i assets/quick_demo/img0.png assets/quick_demo/img1.png
```
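As a quick sanity check on the frame-count relation above, $2^n\times (m-1)+1$ can be evaluated directly in the shell (a minimal sketch; `n` and `m` here are illustrative values, not parameters of the demo script):

```bash
# Each of the n recursive 2x passes doubles the number of frame gaps (m - 1),
# and the m original frames stay in the sequence, hence 2^n * (m - 1) + 1.
n=6   # [N_ITER]
m=2   # number of input frames
echo $(( (1 << n) * (m - 1) + 1 ))   # prints 65
```

So the simple usage above (`-n 6` with two input images) produces 65 output frames.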

+ Note: enable `--save_images` to save the output images (saving becomes slower when there are many output frames).
+ Supported input types: `a video` / `a glob pattern` / `multiple images` / `a folder containing the input frames`.
+ Results are written to the `[OUT_PATH]` folder (default: `results/2x`).

## Pretrained Models

<p id="Pretrained"></p>

<table>
  <thead>
    <tr>
      <th> Model </th>
      <th> :link: Download Links </th>
      <th> Config file </th>
      <th> Trained on </th>
      <th> Arbitrary/Fixed </th>
    </tr>
  </thead>
  <tbody>
    <tr>
      <td>AMT-S</td>
      <td> [<a href="https://drive.google.com/file/d/1WmOKmQmd6pnLpID8EpUe-TddFpJuavrL/view?usp=share_link">Google Drive</a>][<a href="https://pan.baidu.com/s/1yGaNLeb9TG5-81t0skrOUA?pwd=f66n">Baidu Cloud</a>] </td>
      <td> [<a href="cfgs/AMT-S.yaml">cfgs/AMT-S</a>] </td>
      <td>Vimeo90k</td>
      <td>Fixed</td>
    </tr>
    <tr>
      <td>AMT-L</td>
      <td> [<a href="https://drive.google.com/file/d/1UyhYpAQLXMjFA55rlFZ0kdiSVTL7oU-z/view?usp=share_link">Google Drive</a>][<a href="https://pan.baidu.com/s/1qI4fBgS405Bd4Wn1R3Gbeg?pwd=nbne">Baidu Cloud</a>] </td>
      <td> [<a href="cfgs/AMT-L.yaml">cfgs/AMT-L</a>] </td>
      <td>Vimeo90k</td>
      <td>Fixed</td>
    </tr>
    <tr>
      <td>AMT-G</td>
      <td> [<a href="https://drive.google.com/file/d/1yieLtKh4ei3gOrLN1LhKSP_9157Q-mtP/view?usp=share_link">Google Drive</a>][<a href="https://pan.baidu.com/s/1AjmQVziQut1bXgQnDcDKvA?pwd=caf6">Baidu Cloud</a>] </td>
      <td> [<a href="cfgs/AMT-G.yaml">cfgs/AMT-G</a>] </td>
      <td>Vimeo90k</td>
      <td>Fixed</td>
    </tr>
    <tr>
      <td>AMT-S</td>
      <td> [<a href="https://drive.google.com/file/d/1f1xAF0EDm-rjDdny8_aLyeedfM0QL4-C/view?usp=share_link">Google Drive</a>][<a href="https://pan.baidu.com/s/1eZtoULyduQM8AkXeYEBOEw?pwd=8hy3">Baidu Cloud</a>] </td>
      <td> [<a href="cfgs/AMT-S_gopro.yaml">cfgs/AMT-S_gopro</a>] </td>
      <td>GoPro</td>
      <td>Arbitrary</td>
    </tr>
  </tbody>
</table>

## Training and Evaluation

Please refer to [develop.md](docs/develop.md) to learn how to benchmark AMT and how to train a new AMT model from scratch.

## Citation
If you find our repo useful for your research, please consider citing our paper:

```bibtex
@inproceedings{licvpr23amt,
  title={AMT: All-Pairs Multi-Field Transforms for Efficient Frame Interpolation},
  author={Li, Zhen and Zhu, Zuo-Liang and Han, Ling-Hao and Hou, Qibin and Guo, Chun-Le and Cheng, Ming-Ming},
  booktitle={IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2023}
}
```

## License
This code is licensed under the [Creative Commons Attribution-NonCommercial 4.0 International](https://creativecommons.org/licenses/by-nc/4.0/) license for non-commercial use only.
Please note that any commercial use of this code requires formal permission prior to use.

## Contact

For technical questions, please contact `zhenli1031[AT]gmail.com` and `nkuzhuzl[AT]gmail.com`.

For commercial licensing, please contact `cmm[AT]nankai.edu.cn`.

## Acknowledgement

We thank Jia-Wen Xiao, Zheng-Peng Duan, Rui-Qi Wu, and Xin Jin for proofreading.
We thank [Zhewei Huang](https://github.com/hzwer) for his suggestions.

Here are some great resources we benefit from:

- [IFRNet](https://github.com/ltkong218/IFRNet) and [RIFE](https://github.com/megvii-research/ECCV2022-RIFE) for data processing, benchmarking, and loss designs.
- [RAFT](https://github.com/princeton-vl/RAFT), [M2M-VFI](https://github.com/feinanshan/M2M_VFI), and [GMFlow](https://github.com/haofeixu/gmflow) for inspiration.
- [FILM](https://github.com/google-research/frame-interpolation) for the web demo reference.

**If you develop or use AMT in your projects, please let us know; we will list your projects in this repository.**

We also thank potential future contributors.

assets/amt_demo.gif

12.8 MB

assets/quick_demo/img0.png

1.17 MB

assets/quick_demo/img1.png

1.16 MB

0 commit comments
