Tools for non-realtime rendering of generative content in Max/MSP/Jitter. Written by Tim Heinze, © 2020, www.xenorama.com
Provides a set of abstractions which support offline, frame-by-frame, hiQ rendering of generative, interactive content built in Max/MSP/Jitter. They are designed to be integrated as intuitively as possible into any previously developed patching environment, limited only by the latter's complexity and a handful of currently unsupported methods. Note that these are not externals but abstractions, and can be modified freely.
- audio-reactive
- timing-sensitive
- realtime
- timelines
- Cycling '74 Max, V. 8.1.8
- jit.world context
- download the ZIP file
- add the resulting folder to Max's search path, i.e. the Packages folder
- open Max; under the Extras menu, select the entry the.oneirotomy, which should appear there
- read the documentation to get an overview and scan the help files
- check the help files and reference pages of all objects for a detailed overview, especially the.jit.thalamus, which lies at the core of the process
- also check the limitations of certain workflows and objects, as not all functionality can be provided natively and/or instantaneously
- generally, consider which parts of your patch translate from the timing-sensitive or signal domain to the video domain
- instantiate the.oneirotomy.setup to highlight relevant objects, upgrade them for compatibility and/or transform potential legacy render setups (using jit.gl.render)
- otherwise, add the objects to any jit.world's rendering process in Max as documented
- specify all desired settings to the.jit.renderer~ prior to recording and subsequent rendering, as changing the framerate afterwards is likely to purge all previously captured data or to distort the results (a scripted sketch follows this list). Note that since these objects are abstractions, they cannot link to an attrui object, nor can they respond to the universal object, for example
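As a rough illustration of the configure-before-record rule above, a [js] helper could script a named the.jit.renderer~ instance; the scripting name «renderer» and the exact attribute messages here are assumptions for illustration, not the abstraction's documented interface.

```js
// Hedged sketch: push all settings to the.jit.renderer~ before any
// recording starts. The scripting name "renderer" and the attribute
// names below are assumptions, not the documented interface.
function configure(fps, width, height) {
    var renderer = this.patcher.getnamed("renderer");
    if (!renderer) {
        post("no object with scripting name 'renderer' found\n");
        return;
    }
    renderer.message("framerate", fps);     // set before recording; changing it later may purge captured data
    renderer.message("dim", width, height); // hypothetical output-size attribute
}
```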
In Oneirotomy, the approach to rendering lossless hiQ video is to capture and record everything that is subject to precise timing and operates at high priority in the scheduler, before starting the step-by-step rendering process. The latter is performed offline and ignores all realtime data while rebuilding what has been recorded. Needless to say, rendering times may increase considerably.
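To make the principle concrete, here is a minimal, hypothetical sketch for Max's [js] object. None of these function names belong to the library; they merely mimic the two phases: capturing timestamped high-priority events, then replaying them offline, frame by frame, at a fixed framerate.

```js
inlets = 1;
outlets = 1;

var events = [];       // recorded events: { time (ms), value }
var recording = false;
var t0 = 0;

function record(onoff) {
    // start or stop capturing scheduler-domain input
    recording = (onoff !== 0);
    if (recording) {
        events = [];
        t0 = max.time; // current scheduler time in ms
    }
}

function msg_float(v) {
    // any timing-sensitive value arriving at high priority
    if (recording) events.push({ time: max.time - t0, value: v });
}

function rebuild(fps) {
    // offline pass: step through virtual frames, replaying each event
    // into the patch exactly as it was recorded
    var frameDur = 1000.0 / fps;
    var i = 0;
    for (var frame = 0; i < events.length; frame++) {
        var frameEnd = (frame + 1) * frameDur;
        while (i < events.length && events[i].time < frameEnd) {
            outlet(0, events[i].value);
            i++;
        }
        // ...here one frame would be rendered and written to disk...
    }
}
```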
- advanced knowledge of Max/MSP/Jitter
- mostly based around JavaScript code
- dictionaries (see the sketch after this list)
- gen
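Since dictionaries carry much of the captured data, the following [js] fragment sketches how recorded values might be laid out in a named dict; the dict name and hierarchical key scheme are invented for illustration.

```js
// Hedged sketch: storing and retrieving captured per-frame values in a
// named [dict]. The dict name and key layout are invented for this example.
var store = new Dict("the.capture.example");

function save(frame, param, value) {
    // hierarchical keys use "::" as a path separator in Max dicts
    store.replace("frames::" + frame + "::" + param, value);
}

function load(frame, param) {
    return store.get("frames::" + frame + "::" + param);
}
```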
Oneirotomy (/ɒnɪˈrɒtɔmi/; from Greek ὄνειρον, oneiron, «dream», and τομή, tomé, «cut, slice») is a neologism to be translated as «dream slice»: individual frames of realtime video can be sliced and reproduced in non-realtime. All objects carry names of anatomical (or related) terms pertaining to their equivalent function in a supposed offline rendering chain that rebuilds generative patches into fluid hiQ video or image sequences with settings of choice (the «dream»).
The prefix «the» loosely stands for the initials of Tim Heinze. It is used to avoid naming conflicts with other people's abstractions and externals.
The creation of this library was inspired by Julien Bayle's post on the Cycling '74 forum and by my own need to turn visual content produced with Jitter into complex hiQ video for a multi-layered performance. The ongoing debate about techniques to capture and render generative video content reliably and at any quality will hopefully benefit from it as well.
- all considerations and approaches are documented in the vignettes of this package
- the reference pages can be accessed directly from the.jit.renderer~
- tutorials and tutorial patches are to follow; with user feedback and individual setups, this process can be accelerated and optimized
Please share your experience and ideas for development at any time. There will certainly be plenty to discuss and optimize, given that many territories of Max patching have hardly been touched upon in the course of developing this library. Thank you very much in advance for any input or feedback.
Tim Heinze
Xenorama