Home
The Musical Gestures Toolbox for Python is a collection of high-level modules targeted at researchers working with video recordings. It includes visualization techniques such as motion videos, motion history images, and motiongrams, which in different ways allow for looking at video recordings from different temporal and spatial perspectives. It also includes basic computer vision analysis, such as extracting the quantity and centroid of motion and using such features in further analysis.
The toolbox was originally developed to analyze music-related body motion (musicians, dancers and perceivers), but is equally useful for other disciplines working with video recordings of humans, such as linguistics, pedagogy, psychology, and medicine.
For standard use, you only need to install the package from PyPI. Alternatively, you can clone or download the latest version of the repository.
- Install Anaconda. This installs Python and all the libraries needed to run MGT.
- Install FFmpeg, the multimedia toolbox that MGT builds on. Make sure that the FFmpeg binaries (ffmpeg, ffplay, and ffprobe) are added to your system path; the sketch after this list includes a quick check. If you download FFmpeg as a compressed archive, you will find the binaries in the bin folder (for example /path/to/extracted_folder/bin) after extraction. On Windows, make sure you download the git version, not the release version, as the latter has a problem with the normalize filter.
- Install the official version of MGT using pip:
pip install musicalgestures
- Run the Jupyter Notebook that comes with MGT to test that things work, or try the minimal sketch below.
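If the installation succeeded, a short script like the one below should run without errors. This is only a minimal sketch: the filename is a placeholder, and the MgVideo class and motion() method follow the toolbox's documented API but may differ between versions, so check the bundled notebook for the exact names.

```python
# Minimal sketch for checking the installation. "dance.avi" is a placeholder;
# MgVideo and motion() are taken from the documented API and should be
# verified against the bundled notebook for your installed version.
import shutil

# 1. Check that the FFmpeg binaries are visible on the system path.
for tool in ("ffmpeg", "ffplay", "ffprobe"):
    print(tool, "->", shutil.which(tool) or "NOT FOUND on PATH")

# 2. Check that the toolbox imports and can process a video.
import musicalgestures

mg = musicalgestures.MgVideo("dance.avi")  # load a video for analysis
mg.motion()  # render a motion video and export motion data
```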
On Linux (at least Ubuntu), you need some GStreamer plugins to work with H.264-compressed video files. You can install them with:
sudo apt-get install gstreamer1.0-plugins-base gstreamer1.0-plugins-ugly
Due to some problems with the Tkinter package, you also need to install it manually:
sudo apt-get install python3-tk
On macOS, you can install FFmpeg via Homebrew:
brew install ffmpeg
If there is an error with creating symbolic links during the installation, try fixing it with brew unlink vde.
Additionally, you need to install wget to be able to download the OpenPose models:
brew install wget
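The models are fetched the first time you run pose estimation. As a hedged sketch of that step (the filename is a placeholder, and the pose() method name follows the toolbox's documented API but may differ between versions):

```python
# Hedged sketch of pose estimation. "dance.avi" is a placeholder; pose()
# follows the documented API and downloads the OpenPose models on first use,
# which is why wget must be available.
import musicalgestures

mg = musicalgestures.MgVideo("dance.avi")
mg.pose()  # run OpenPose-based pose estimation on the video
```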
The Musical Gestures Toolbox contains functions to analyze and visualize video, audio, and motion capture data. There are three categories of functions (a sketch of how they combine follows the list):
- Preprocesses (time-period extraction, time dilation, color management, etc.)
- Processes (motion, history, optical flow, pose estimation, etc.)
- Visualization functions (video playback, image display, plotting)
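As a rough sketch of how the three categories combine (the filename is a placeholder, and the constructor arguments and method names are assumptions based on the documented API, so check the notebook for your installed version):

```python
# Hedged sketch combining the three categories. "dance.avi" is a placeholder;
# the constructor arguments and method names are assumptions based on the
# documented API and may differ between versions.
import musicalgestures

# Preprocessing: load only a time segment of the source video.
mg = musicalgestures.MgVideo("dance.avi", starttime=5, endtime=15)

# Processes: motion video with motiongrams, and a motion history video.
mg.motion()
mg.history()

# Visualization: display the video inside a notebook or an external player.
mg.show()
```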
Look at the MusicalGesturesToolbox.ipynb notebook to get an idea of how to use the toolbox. (Note that it might not display correctly online on GitHub due to its large size.)
Some known issues are listed in the issues section. Please help improve the toolbox by reporting bugs and feature requests there.
A project from the fourMs Lab, RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion, Department of Musicology, University of Oslo.