This app listens to audio captured by a phone or laptop microphone and predicts the piano chords the user plays. It uses Teachable Machine to train the model and ml5.js to run predictions. Webpack bundles the application and Monaca handles mobile deployment.
piano_chords_preview.mp4
A tutorial is available on Medium via this link.
- Download the project.
- Run `npm install` in the project directory.
- Run `npm run dev` to start the project.
- If the browser opens the URL 0.0.0.0:8080, change it to localhost:8080.
- Wait until the model loads.
- When the predicting page appears, play the chord shown on the card.
- If you do not know the chord, tap the card or the Reveal button to see an image of the chord.
- After you play the chord successfully, the card turns green and the next one appears.
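The success check described above can be sketched as follows, assuming the classifier delivers results the way ml5.js does: an array of predictions sorted by confidence. The names `targetChord`, `cardState`, and `markSuccess` are illustrative, not the app's actual code.

```javascript
// Sketch of the chord-check step (assumed names, not the app's real code).
const targetChord = 'C major'; // label printed on the current card
let cardState = 'neutral';

function markSuccess() {
  // In the app this would update the Vue component state instead.
  cardState = 'green';
}

function checkChord(results) {
  const top = results[0]; // highest-confidence prediction comes first
  // Accept only a confident match against the card's chord.
  if (top.label === targetChord && top.confidence > 0.8) {
    markSuccess();
  }
}

// Example: a confident, correct prediction turns the card green.
checkChord([{ label: 'C major', confidence: 0.92 }]);
```

The confidence threshold (0.8 here) is an assumption; a real app would tune it against background noise and imperfect playing.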
- `hybridFunctions.js` - functions for starting the microphone and predictions
- `LoadingPage.vue` - the first screen the user sees while the model loads
- `PredictingPage.vue` - the main screen where prediction happens
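A minimal sketch of what `hybridFunctions.js` might contain, assuming the ml5.js `soundClassifier` API for Teachable Machine audio models. `MODEL_URL` is a placeholder, not the project's actual model, and the function names are illustrative.

```javascript
// Placeholder model URL, not the project's real one.
const MODEL_URL =
  'https://teachablemachine.withgoogle.com/models/MODEL_ID/model.json';

let classifier;
let latestPrediction = null;

function startPredictions() {
  // Load the Teachable Machine audio model; ml5 requests microphone
  // access when classification starts.
  classifier = ml5.soundClassifier(
    MODEL_URL,
    { probabilityThreshold: 0.7 }, // ignore low-confidence results
    () => classifier.classify(gotResult) // classify continuously once loaded
  );
}

function gotResult(error, results) {
  if (error) return console.error(error);
  // results are sorted by confidence, highest first.
  latestPrediction = results[0].label;
}
```

In the app, `gotResult` would feed the prediction into the Vue component rather than store it in a variable.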
There is a webpack bundler setup that compiles and bundles all front-end resources. You should work only with files located in the `/src` folder. The webpack config is located in `script/webpack.config.js`.
Webpack has a specific way of handling static assets (CSS files, images, audio). You can learn more about the correct approach in the official webpack documentation.
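As a rough illustration, webpack 5 typically handles these assets with loaders for CSS and built-in asset modules for binary files. This is a hedged sketch of what such rules look like; the actual rules in `script/webpack.config.js` may differ.

```javascript
// Illustrative webpack 5 asset rules, not the project's actual config.
const assetConfig = {
  module: {
    rules: [
      {
        // Compile CSS and inject it into the page at runtime.
        test: /\.css$/i,
        use: ['style-loader', 'css-loader'],
      },
      {
        // Emit images and audio as separate files and return their URLs.
        test: /\.(png|jpe?g|gif|svg|mp3|wav)$/i,
        type: 'asset/resource',
      },
    ],
  },
};

// Guarded so the snippet also runs outside CommonJS.
if (typeof module !== 'undefined') module.exports = assetConfig;
```

With rules like these, importing an image or audio file from JavaScript yields a URL that webpack resolves at build time.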