A simple Twilio Video app showing how to build a video processor that replaces the background of a video stream.
virtual-background-demo.markbrouch.com
```bash
# install dependencies
$ yarn install

# serve with hot reload at localhost:3000
$ yarn dev

# build for production and launch server
$ yarn build
$ yarn start

# generate static project
$ yarn generate
```
The virtual background video processor pipes each video frame through Google's MediaPipe Selfie Segmentation model and then uses the Canvas API to composite the virtual background layers.
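The compositing step can be sketched roughly like this (a minimal TypeScript sketch, not this repo's actual code; names such as `VirtualBackgroundProcessor` and `backgroundImage` are illustrative assumptions):

```ts
import { SelfieSegmentation, Results } from "@mediapipe/selfie_segmentation";

// Sketch of a Twilio Video processor that swaps in a virtual background.
// Class and property names here are illustrative, not the repo's identifiers.
class VirtualBackgroundProcessor {
  private segmenter: SelfieSegmentation;
  private backgroundImage: HTMLImageElement;

  constructor(backgroundImage: HTMLImageElement) {
    this.backgroundImage = backgroundImage;
    this.segmenter = new SelfieSegmentation({
      locateFile: (file) =>
        `https://cdn.jsdelivr.net/npm/@mediapipe/selfie_segmentation/${file}`,
    });
    this.segmenter.setOptions({ modelSelection: 1 });
  }

  // twilio-video calls processFrame for every frame once the processor
  // is attached to a local video track.
  async processFrame(
    inputFrame: HTMLCanvasElement,
    outputFrame: HTMLCanvasElement
  ): Promise<void> {
    const results = await this.segment(inputFrame);
    const ctx = outputFrame.getContext("2d")!;
    const { width, height } = outputFrame;

    ctx.save();
    ctx.clearRect(0, 0, width, height);

    // 1. Draw the person/background segmentation mask.
    ctx.drawImage(results.segmentationMask, 0, 0, width, height);

    // 2. Keep only the source-frame pixels that overlap the mask (the person).
    ctx.globalCompositeOperation = "source-in";
    ctx.drawImage(results.image, 0, 0, width, height);

    // 3. Paint the virtual background behind the person.
    ctx.globalCompositeOperation = "destination-over";
    ctx.drawImage(this.backgroundImage, 0, 0, width, height);

    ctx.restore();
  }

  private segment(input: HTMLCanvasElement): Promise<Results> {
    return new Promise((resolve) => {
      this.segmenter.onResults(resolve);
      this.segmenter.send({ image: input });
    });
  }
}
```

Hooking it up would then look roughly like `localVideoTrack.addProcessor(new VirtualBackgroundProcessor(backgroundImage))`, using twilio-video's `addProcessor` API.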
Check out my upcoming talk at JSConf Budapest to learn more about how I built this!