
2026 Update! This project will undergo an overhaul after a release is made for the Linux-CoumfyUI-Launcher repo. There have been significant tech developments in the year since Studio Whip began, and they will bring major improvements across the entire project.

Studio-Whip

AI-enhanced collaborative content production suite for movies, comics, and interactive visual novels. Create compelling stories with seamlessly integrated tools designed to enhance your creative talents.

A website (still largely incomplete) is under construction here, and recent development proposals and activity can be followed in the GitHub issues.

Major Planned Features

  • Story-Driven Platform: Create original screenplays with visual story-building tools and advanced LLM integrations.
  • P2P Real-Time Collaboration: Create together remotely for free, without requiring third-party servers.
  • Storyboarding: Generate images, draw sketches, and position 3D assets to create dynamic animatic storyboards linked to your script.
  • Audio Editing: Sequence generated/recorded dialogue, music, and SFX within your timelines.
  • Video Production: Develop storyboards into rendered scenes using integrated image and video generation models.
  • Professional Color Grading: Make expressive color choices in a color-managed environment using scopes, primary/secondary adjustments, and AI-assisted tools.
  • Node-Based Compositing: Combine multiple visual elements (renders, footage, effects) into final shots using a flexible node graph system.

System Requirements

System requirements heavily depend on the size and type of AI models you choose to run locally. You can find many models on platforms like Hugging Face and Civitai.

The table below summarizes recommended hardware specifications for different tiers of usage, focusing on local inference:

  • AI Performance (AI TOPS) is measured using FP8 precision.
  • CPU performance estimates use PassMark CPU Mark scores.
  • Storage estimates are minimums for the Studio-Whip base install plus a few models. Your actual needs will be higher depending on the number and size of models and project assets. NVMe SSDs are highly recommended.

| Tier        | Use Case                     | RAM   | VRAM  | AI TOPS | Storage | CPU Performance |
| ----------- | ---------------------------- | ----- | ----- | ------- | ------- | --------------- |
| Entry-Level | Development, Testing         | 32GB  | 8GB   | 250     | 32GB    | 20K+            |
| Mid-Range   | Education, Personal Projects | 32GB  | 16GB  | 500     | 128GB   | 30K+            |
| High-End    | Advanced Projects, Video     | 64GB  | 24GB+ | 1000    | 512GB   | 50K+            |
| Enterprise  | Fast Generation              | 128GB | 96GB+ | 4000    | 1TB     | 80K+            |

Example Model Configurations per Tier

The following table provides example model combinations suitable for each hardware tier when running locally. These are just suggestions; you can:

  • Mix and match models based on your specific tasks (writing, image generation, video generation, etc.).
  • Use fewer, larger models or more, smaller models depending on VRAM/RAM.
  • Choose models optimized for specific hardware (e.g., INT4/FP8 quantizations if supported).
  • Combine local models with cloud APIs.
  • Distribute models across CPU and GPU.

Per-tier model suggestions (hover over a model for its license) cover Creative Writing, Instruct, Image Generation, and Video Generation models; at the Entry-Level tier, the Instruct model doubles for creative writing, and video generation is not practical.

How To Build From Source

Requirements (All Platforms)

  • Vulkan SDK: 1.3 or later
  • Rust: Latest stable version (via Rustup)
  • A GPU: an Nvidia card is suggested for compatibility and inference performance, but is not strictly required
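
For reference, a minimal sketch of getting the Rust toolchain in place on Linux and checking that the tools are reachable; the Vulkan SDK itself is installed separately (for example from LunarG's installers or your distribution's packages):

```
# Install the stable Rust toolchain via Rustup (official installer script).
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh

# Confirm the toolchain is on PATH.
rustc --version
cargo --version

# Confirm the Vulkan SDK's shader compiler is reachable.
glslc --version
```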

After installing the requirements

  1. Clone the repository: git clone https://github.com/<your-repo>/studio-whip.git
  2. Navigate to the Rust crate: cd studio-whip/rust
  3. Build and run: cargo run --release
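
Put together, the build sequence from the steps above is:

```
# Clone the repository (replace <your-repo> with the actual GitHub namespace).
git clone https://github.com/<your-repo>/studio-whip.git

# The Rust crate lives in the rust/ subdirectory.
cd studio-whip/rust

# Build and launch an optimized release binary.
cargo run --release
```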

For Windows Users

Install Windows Subsystem for Linux (WSL); this lets you run the Linux shell script utilities located in /rust/utilities.

  1. Open PowerShell as administrator and install WSL: wsl --install
  2. List the available Linux distributions: wsl --list --online
  3. Install the latest Ubuntu LTS: wsl --install -d <distro>
  4. Launch the Linux distribution: press Win+R, type Ubuntu, and press Enter.
  5. Windows paths in Ubuntu are mounted under /mnt/<lowercase-drive-letter>/*
  6. You may need to install dos2unix within your Linux environment to convert Windows line endings.
    • Install it: sudo apt update && sudo apt install dos2unix
    • Example usage: dos2unix llm_prompt_tool.sh
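
Condensed, and with an example distribution name and an illustrative repository path (adjust both to your setup), the WSL setup looks like this:

```
# In an elevated PowerShell session on Windows:
wsl --install                   # enable WSL and install the default distribution
wsl --list --online             # list distributions available to install
wsl --install -d Ubuntu-24.04   # example: install a specific Ubuntu LTS

# Inside the Ubuntu shell:
sudo apt update && sudo apt install dos2unix
cd /mnt/c/path/to/studio-whip/rust/utilities   # Windows drives are mounted under /mnt
dos2unix llm_prompt_tool.sh
```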

After installing WSL, add the directory containing the Vulkan SDK's glslc compiler to your Windows PATH:

  1. Press Win + R, type SystemPropertiesAdvanced, and click Environment Variables.
  2. Under "System Variables" or "User Variables," select Path and click Edit.
  3. Click New and add: C:\VulkanSDK\<version>\Bin (replace <version> with your installed version).
  4. Click OK to save.
  5. Verify with glslc --version in PowerShell. It should output the compiler version.
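
Once glslc resolves on PATH you can also check that it compiles GLSL to SPIR-V; the shader file name here is only a placeholder:

```
# Print the compiler version (should match the installed Vulkan SDK).
glslc --version

# Compile a vertex shader to SPIR-V; glslc infers the stage from the .vert extension.
glslc shader.vert -o shader.vert.spv
```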

Contributing

Check out the architecture overview, modules documentation, Roadmap, and prompt_tool.sh to get started.

Is This Production Ready?

No. This is complex software in early development, with partial and unimplemented features. It will take at least a year before it is plausibly ready for production use.
