- Prepare a demo gif and a link for your mobile machine learning project.
I recommend a gif width of 320 px.
If you want to know how to generate a gif demo, check this tutorial.
- Make an issue before opening a PR.
Issue guidelines are not ready yet; I will prepare them.
- Fork the Awesome Machine Learning DEMOs with iOS repository.
- Add your demo gif and link.
- Make a PR.
- Record the DEMO video on iOS.
→ How to record the screen on your iPhone, iPad, or iPod touch
- Export the video to your Mac.
- Convert the video to a gif using the following script.
- Install FFmpeg on your Mac.
→ Run the following command in the Terminal app.
$ brew install ffmpeg
- Download the script (gifconverter.sh).
#!/bin/sh
# gifconverter.sh: convert a video to a 320 px wide gif at 8 fps.
video_path="$1"
file=`basename "$1"`
extension="${file##*.}"
gif_path="${video_path%.*}.gif"
echo "$extension"
echo "$gif_path"
# Pass 1: generate an optimized color palette from the video.
ffmpeg -y -i "$video_path" -vf fps=8,scale=320:-1:flags=lanczos,palettegen palette.png
# Pass 2: build the gif using that palette for better colors.
ffmpeg -y -i "$video_path" -i palette.png -filter_complex "fps=8,scale=320:-1:flags=lanczos[x];[x][1:v]paletteuse" "$gif_path"
- Run the script with the demo video path.
$ sh gifconverter.sh ~/Desktop/DEMO.mp4
- You can find DEMO.gif on ~/Desktop.
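As a quick sanity check on the recommended 320 px width, note that a gif stores its width in header bytes 7-8 (little-endian), so you can read it with `od` and no extra tools. A minimal sketch; the `DEMO.gif` path follows the example above, so adjust it to your file:

```shell
# Sketch: read a gif's width straight from its header to confirm 320 px.
# Bytes 7-8 of a GIF89a file hold the logical screen width, little-endian.
gif="$HOME/Desktop/DEMO.gif"
# od prints the two bytes as unsigned decimals; awk combines them.
width=$(od -An -tu1 -j6 -N2 "$gif" | awk '{ print $1 + $2 * 256 }')
echo "width: ${width} px"
```

If the printed width is not 320, rerun the conversion script, since it scales to 320 by default.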
Show the output for each input? Draw the detail of the result? Add tests for debugging?
- Pose Estimation: draw a dot for each keypoint and joint, and print the confidence of each point.
- ...
Analyze the outputs from a batch of inputs
- average inference time and fps
- accumulated execution time, fps, ...?
- rendering time
- total execution time
- ...
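The averaging idea above can be sketched with plain shell: given a log of per-frame inference times, awk can report the average time and the implied fps. The log format (one millisecond value per line) and the file name `inference_times.txt` are assumptions for illustration:

```shell
# Sketch: average inference time and fps from a per-frame log.
# inference_times.txt is a made-up name; the sample values below
# stand in for a real measurement log (one time in ms per line).
printf '40\n50\n60\n' > inference_times.txt
awk '{ total += $1; n += 1 }
     END {
         printf "frames: %d\n", n
         printf "avg inference: %.2f ms\n", total / n
         printf "avg fps: %.2f\n", 1000.0 * n / total
     }' inference_times.txt
# → frames: 3, avg inference: 50.00 ms, avg fps: 20.00
```

The same accumulation works for rendering time or total execution time; log each stage to its own file and run the awk pass per file.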