Draw <--> Sound <--> Live is an application that lets users perform, or watch the performance of, someone who draws music and sound. That is, performers create a sound performance by drawing with their mouse. Our digital space is inspired by Soundspace at the Durham Science Museum, which merges movement, sound, and visual art.
The app provides an open-ended participatory experience in which performers interact with the canvas DOM. The on-screen canvas acts as an instrument: it manipulates a soundtrack and generates sound based on input events (e.g. mouse location, direction, velocity), which the audience both sees and hears. Every movement is drawn and viewable. The underlying rules are not made explicit to the users; they are simply sketching "music".
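As a sketch of how such input events might be reduced to the features mentioned above, consider a helper that derives location, direction, and velocity from two successive mouse samples. The function name and the `{x, y, t}` sample shape are assumptions for illustration, not the app's actual code:

```javascript
// Hypothetical helper: derive location, direction, and velocity from two
// successive mouse samples, each of the form { x, y, t } with t in ms.
function describeMotion(prev, curr) {
  const dx = curr.x - prev.x;
  const dy = curr.y - prev.y;
  const dt = (curr.t - prev.t) / 1000; // elapsed time in seconds
  return {
    x: curr.x,
    y: curr.y,
    direction: Math.atan2(dy, dx),     // angle of travel, in radians
    velocity: Math.hypot(dx, dy) / dt, // speed, in pixels per second
  };
}
```

Features like these could then be fed into whatever mapping the canvas uses to drive the audio.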
Viewers can rate and comment on the performance. Performer stats (e.g. number of performances, ratings) are displayed.
The main landing page of the app has a "Live" button, which redirects the user to the Google Authentication screen to validate their credentials.
Landing Page of Draw <--> Sound <--> Live

The login page offers options to proceed.
Authentication Page of Draw <--> Sound <--> Live

Once authenticated, the user is directed to the Venue, where they are presented with the stage, a chat area, and a "Get On Stage" button that allows any audience member to perform.
In the Venue, waiting for Performer

If an audience member decides to get on stage, they can then explore the canvas and keyboard and begin to create. Draw. Sound. Live.
During the performance, the audience is encouraged to express their feelings and emotions via applause, boos, or chat. This feedback persists with the performer and, at the end of the performance, is added to their lifetime totals. Note that performers who receive too many "boos" are kicked off stage.
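The reaction tallying and the boo-threshold rule could be sketched like this; the threshold value and all names here are assumptions, as the app's actual values are not stated:

```javascript
const BOO_LIMIT = 10; // hypothetical threshold; the real limit is not stated

// Apply one audience reaction to a performer's running totals and decide
// whether the performer should be kicked off stage.
function applyReaction(stats, reaction) {
  const next = {
    applause: stats.applause + (reaction === "applause" ? 1 : 0),
    boos: stats.boos + (reaction === "boo" ? 1 : 0),
  };
  return { stats: next, kicked: next.boos >= BOO_LIMIT };
}
```

At the end of a performance, totals like these would be folded into the performer's lifetime stats.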
Behind the scenes there are many moving parts. The venue contains various components that interact with each other. The primary composition element is the canvas. It renders the performer's mouse movements into a drawing that manipulates settings of the sound effects. The result is a continuously evolving visual-musical performance art experience, in which the performer is invited to explore the possibilities of the canvas and the "instruments" hidden within it.
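One plausible way a drawing could manipulate the sound settings is to map pointer position onto synth parameters. This is a sketch with ranges and names chosen for illustration, not taken from the app:

```javascript
// Map a pointer position on the canvas to hypothetical synth parameters.
function mapToSound({ x, y, width, height }) {
  // Horizontal position sweeps the pitch across two octaves (220-880 Hz).
  const frequency = 220 + (x / width) * 660;
  // Vertical position controls loudness; the top of the canvas is loudest.
  const gain = 1 - y / height;
  return { frequency, gain };
}

// In the browser, values like these could drive a Web Audio oscillator:
//   osc.frequency.setValueAtTime(frequency, audioCtx.currentTime);
//   gainNode.gain.setValueAtTime(gain, audioCtx.currentTime);
```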
Looking deeper into the app, socket.io is used for bi-directional, event-based realtime communication, which enables the performer to project their performance to the audience. It also provides a means for chat, so the audience can hold an ongoing discussion and convey their sentiments.
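The fan-out logic can be sketched framework-agnostically; in socket.io the same pattern is `socket.broadcast.emit` (everyone except the sender) versus `io.emit` (everyone). The event names and function below are illustrative, not the app's actual code:

```javascript
// Decide which connected clients should receive an incoming event.
function route(event, senderId, allIds) {
  if (event.type === "draw") {
    // The performer's strokes are relayed to everyone except the sender.
    return allIds.filter((id) => id !== senderId);
  }
  if (event.type === "chat") {
    // Chat messages are echoed to all clients, sender included.
    return allIds.slice();
  }
  return []; // unknown events are dropped
}
```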
The application is based on React, which encourages reusable UI components with dynamically rendered views. The app has a root component, App, with three main children: Login, Landing, and Venue. Within this structure, the state of the participants and their interactions is managed.
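At its simplest, the top-level render decision might look like the following; the state flags are assumptions about how App tracks participants, not the actual implementation:

```javascript
// Pick which main component App should render, given hypothetical state flags.
function selectView({ authenticated, inVenue }) {
  if (!authenticated) return "Login";   // not yet signed in with Google
  return inVenue ? "Venue" : "Landing"; // signed in: landing page or venue
}
```

In React, App would render `<Login />`, `<Landing />`, or `<Venue />` based on a decision like this.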
You can run a hosted version on Heroku here.
Alternatively, you can install and run the app locally. To do this, first clone the git repo and then install the dependencies:
git clone https://github.com/DSEapps/draw-sound-live.git
cd draw-sound-live
yarn install
cd client
yarn install
cd ..
Next, MongoDB is required for user data. To install it on a Mac:
brew install mongodb
Open two separate terminals. In the first, start the database server:
mongod
In the second, open the Mongo shell:
mongo
Note: You will need to include a .env file in your root directory with GOOGLE_CLIENT_ID and GOOGLE_CLIENT_SECRET variables, which must be obtained from Google. You will also need to register the application's origin and callback routes in your Google Dev Dashboard.
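The .env file would look like the following, with placeholders standing in for the credentials you obtain from Google:

```
GOOGLE_CLIENT_ID=<your Google OAuth client ID>
GOOGLE_CLIENT_SECRET=<your Google OAuth client secret>
```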
After both installations complete, run the following command in your terminal:
yarn start
That's it: your app should be running on http://localhost:3000. The client proxies API requests to the server on http://localhost:3001. To simulate multiple clients, open additional browser tabs to view performances or get on the stage.
Our application starts off with the basic MERN components (MongoDB, Express, React and Node).
Our application just needed a basic authentication component, as we're primarily interested in obtaining a user's name and a unique ID for tracking their participation in the app. We decided to accomplish this using React-Google-Login.
In addition to MERN, Draw <--> Sound <--> Live also relied heavily on the following technologies for visual and audio effects.




