- Normalise all values to the 0-1 range (see the normalisation sketch after this list)
- Option to send values in a rhythmic pattern
- CC (Control Change) messages for continuous values; Note On / Note Off for other, discrete things
- A control change gives you a continuous stream of data
- Next work -- meet in the lab and look at how this works with Reaper and the surround sound
- Take two streams, let's take the X and Y
- TASK FOR NEXT WEEK -
- Multiple streams of data
- MIDI - Same stream, same channel
- Ideas on how to deal with a large number of datasets
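A minimal sketch of the 0-1 normalisation idea in plain Python; the stream names and sample values are made up for illustration, and the last line shows how a 0-1 value scales onto the 0-127 range a MIDI CC expects.

```python
# Minimal sketch: per-stream min-max normalisation into the 0-1 range.
# Stream names ("x", "y") and the sample values are illustrative only.

def normalise(values):
    """Scale a list of numbers linearly into the 0-1 range."""
    lo, hi = min(values), max(values)
    if hi == lo:                      # avoid division by zero for a flat stream
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

streams = {
    "x": [12.0, 40.5, 33.1, 90.2],    # hypothetical X stream
    "y": [0.2, 0.9, 0.4, 0.7],        # hypothetical Y stream
}

normalised = {name: normalise(vals) for name, vals in streams.items()}

# A 0-1 value maps straight onto a 7-bit MIDI CC value when needed:
cc_values = {name: [round(v * 127) for v in vals] for name, vals in normalised.items()}
print(cc_values)
```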
Steps for lab -
- White switch (main power on), then the speakers, then the red connector
- Focusrite Scarlett 18i20 (18 inputs, 20 outputs)
- Device settings (Monitor control, set to All)
- Leave the rest of the settings at their defaults
- Output Routing
- Each monitor connected to the mono playback output of the same number
- Monitor controls change the volume of everything together
For reaper -
- Options > Preferences > Audio > Device (Focusrite USB ASIO)
- Output range set to 8
- Routing icon (deactivate the master send, which would otherwise mix everything down to stereo)
- We need mono for each and every channel
Plugins -
- ReaLearn (by Helgoboss)
- Add mapping
- Edit
- CC ID selection and the target FX (SpatGri)
- Mapping settings like Min and Max (see the scaling sketch after this list)
- Check what s is
- SpatGri
- GRM Tools is paid (only 1 minute is available)
- Fun fact -- GRM was the first studio in France where this kind of thing happened
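As a quick aside on what the Min/Max settings in a ReaLearn mapping do, here is the same scaling written out in Python; the target range values below are placeholders, not numbers from the session.

```python
# Sketch of min/max target scaling: a 7-bit CC value is mapped onto a bounded
# parameter range, the way a ReaLearn mapping with Min/Max set would.
# The 0.2 / 0.8 target range is a placeholder.

def cc_to_param(cc_value, target_min=0.2, target_max=0.8):
    """Map a CC value (0-127) onto a bounded target parameter range."""
    cc_value = max(0, min(127, cc_value))         # clamp to the valid CC range
    return target_min + (cc_value / 127.0) * (target_max - target_min)

print(cc_to_param(0))     # -> 0.2 (target minimum)
print(cc_to_param(127))   # -> 0.8 (target maximum)
```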
Speaker Setup
- French 8 (In the UK)
- US uses the French style
For the issues --
- Post the question on the message boards / to the developers, or write a script
- Sir initially explains the MIDI transfer hierarchy
- Ports
- Channel
- Different IDs per channel (the CC values)
- Sir explains ReaLearn and how to change the mappings for these, then goes into the details
- Three different ideas (see the mido sketch after this list)
- Note-on and note-off
- CCs
- Tuples
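A small sketch of that port > channel > CC-ID hierarchy in code, assuming the Python `mido` library and a virtual loopback port (IAC on macOS, loopMIDI on Windows); the port name, channel numbers, and CC IDs are placeholders.

```python
import mido

# One port carries up to 16 channels; each channel has its own set of CC IDs (0-127).
# The port name below is a placeholder for whatever loopback port Reaper listens on.
PORT_NAME = "loopMIDI Port 1"

with mido.open_output(PORT_NAME) as port:
    port.send(mido.Message("control_change", channel=0, control=1, value=64))   # stream A
    port.send(mido.Message("control_change", channel=0, control=2, value=32))   # stream B: same channel, different CC ID
    port.send(mido.Message("control_change", channel=1, control=1, value=100))  # same CC ID, different channel

    # Discrete events go out as note-on / note-off instead of a continuous CC:
    port.send(mido.Message("note_on", channel=0, note=60, velocity=90))
    port.send(mido.Message("note_off", channel=0, note=60, velocity=0))
```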
- Just-in-time programming paradigm (for fixing the time.sleep issue)
- For now, just look at CC and check with 3 channels
- Generative Music/ Algorithmic Music (change a few more pieces)
- Check how to watch live MIDI data on screen (MIDI-OX for Windows)
- Snoize MIDI Monitor for Mac
- Check whether to send sequential data over MIDI
- Send only a single value for now
- Now, take any audio file and control the reverb on it using image data (see the image-to-CC sketch after this list)
- After that, we can do some experiments using the QuNeo
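A rough sketch of the image-data-to-reverb idea, assuming `mido` plus Pillow for reading the image; the file name, port name, and CC ID are placeholders, and the ReaLearn mapping from that CC to the reverb's wet amount still has to be set up in Reaper.

```python
import mido
from PIL import Image

# Scan a grayscale image column by column, turn each column's mean brightness into
# a CC value, and let ReaLearn map that CC onto the reverb parameter.
img = Image.open("input.png").convert("L")          # grayscale, pixel values 0-255
width, height = img.size
pixels = img.load()

with mido.open_output("loopMIDI Port 1") as port:   # placeholder port name
    for x in range(width):
        column_mean = sum(pixels[x, y] for y in range(height)) / height
        cc = round(column_mean / 255 * 127)          # 0-255 brightness -> 0-127 CC
        port.send(mido.Message("control_change", channel=0, control=11, value=cc))
```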
- Map every stream to 0-1
- Makes modules interchangeable
- Apply the changes in real time
- GUI clock
- Check out small clock changes in test3.py (try to make the change live)
- Set the loop to 1
- Change slider values and send the data dynamically (see the send-loop sketch below)
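A sketch of a send loop that re-reads the slider on every tick and schedules against a monotonic clock instead of letting repeated `time.sleep` calls drift; `read_slider`, the port name, and the tick length are assumptions standing in for whatever the GUI and setup actually provide.

```python
import time
import mido

TICK = 0.05                                   # 20 ticks per second (placeholder)
slider_value = 0.5                            # would be updated live by the GUI

def read_slider():
    """Stand-in for reading the current GUI slider value (0-1)."""
    return slider_value

with mido.open_output("loopMIDI Port 1") as port:   # placeholder port name
    next_tick = time.monotonic()
    last_sent = None
    while True:
        value = read_slider()                 # picked up fresh every tick
        cc = round(max(0.0, min(1.0, value)) * 127)
        if cc != last_sent:                   # only send when the value actually changed
            port.send(mido.Message("control_change", channel=0, control=1, value=cc))
            last_sent = cc
        next_tick += TICK
        time.sleep(max(0.0, next_tick - time.monotonic()))   # correct for drift
```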
- Before next week, try controlling the reverb of an audio file
- Our ears are most sensitive to pitch
- Try controlling the pitch
- Find a free VST that can change a sine wave, put it in the loop, and try changing the pitch
- ReaPitch changes
- Tone oscillators
- Just one parameter of this pitch (see the sweep sketch after this list)
- What we hear with sine waves is crystal clear
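A sketch of sweeping just one pitch parameter: a 0-1 ramp sent out as a single CC, which a ReaLearn mapping would route to the pitch-shift parameter of ReaPitch (or another free pitch VST) sitting after a tone oscillator; the CC ID, port name, and timing are placeholders.

```python
import time
import mido

STEPS = 128                                   # one step per CC value

with mido.open_output("loopMIDI Port 1") as port:   # placeholder port name
    for step in range(STEPS):
        value = step / (STEPS - 1)            # normalised 0-1 ramp
        port.send(mido.Message("control_change", channel=0, control=20,
                               value=round(value * 127)))
        time.sleep(0.05)                      # slow enough to hear the sweep on a sine wave
```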
- Tried the panning in the lab
- Try looking up Dr. Norah Lorway (check her out!)
- Working Demo for a presentation
- Start with the UI
- Combine with the Video version of the UI
- Microphone real-time live input (with Reaper) -- cool idea!!
- How does it affect other distortions (like guitar)?
- Gave Sir the idea of non-linear mapping for music using deep learning
- Will be done later
- Explore Reaper to test which parameters to use in which VSTs
- If we demo something, it should be crystal clear what is happening
- What conversion of data will we need?
- The additional technical details should be driven by the creative need
- Check Native Instruments VST - Kontakt
- Intermediate meeting called by Sir
- A functional UI with basic details
- By the end of the term -
- UI
- Documentation
- Website
- Downloadable version
- Demo version online
- (??) Lab Website