ScenicNL is a compound AI system that generates Scenic programs from crash report descriptions.
At the moment we only support driving scenarios for CARLA, but this is easily extendable since Scenic supports other simulators.
To install and run, first select which LLM backbone you would like to use. Today we support GPT from OpenAI, Claude from Anthropic, and open-source models (e.g. the Llama family) that run locally.
Set your OpenAI API key if you have not already:
- Log in to your OpenAI account and go to https://platform.openai.com/account/api-keys.
- Create an API key.
- Now open a shell: on Windows run `set OPENAI_API_KEY=<your-key>`, and on Unix run `export OPENAI_API_KEY=<your-key>`. We also need an organization ID, so also run `set OPENAI_ORGANIZATION=<your-org-id>` or `export OPENAI_ORGANIZATION=<your-org-id>`.
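A quick way to verify both variables are set before kicking off a pipeline run is a small check like the one below. This is a minimal sketch and not part of ScenicNL itself; the `missing_env_vars` helper name is ours.

```python
import os

def missing_env_vars(required=("OPENAI_API_KEY", "OPENAI_ORGANIZATION"), env=None):
    """Return the names of required environment variables that are unset or empty."""
    env = os.environ if env is None else env
    return [name for name in required if not env.get(name)]

if __name__ == "__main__":
    missing = missing_env_vars()
    if missing:
        raise SystemExit("Set these before running gen_scenic: " + ", ".join(missing))
    print("OpenAI environment looks good.")
```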
One good option for a local model is the Mixtral mixture-of-experts model from Mistral AI. While you can run Mixtral just fine through llama.cpp, in our experience it is a bit faster to use Mozilla's Llamafile. Our code makes no assumptions about which local model you choose to run, but we found that Mixtral-8x7B-Instruct works great for our use case. Once you have picked a model to download from their list of supported models, follow the first four steps of the quick start guide.
For step 5, when you are about to run the llamafile, also add the arguments `--server -np 10`, where `-np` sets the number of server threads (10 in this case), which should be greater than or equal to the number of worker threads you will use in ScenicNL. Full example command:

```shell
./mixtral-8x7b-instruct-v0.1.Q5_K_M.llamafile --server -np 10
```
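Once the llamafile server is up, it can be queried over HTTP. The sketch below assumes the server exposes an OpenAI-compatible chat completions endpoint on `http://localhost:8080/v1` (llamafile's default port); the URL, model name, and `build_request` helper are our assumptions, not part of ScenicNL.

```python
import json
import urllib.request

# Assumed local endpoint for a llamafile started with --server.
LLAMAFILE_URL = "http://localhost:8080/v1/chat/completions"

def build_request(prompt, model="mixtral-8x7b-instruct", max_tokens=256):
    """Build an OpenAI-style chat completion request for the local server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }
    return urllib.request.Request(
        LLAMAFILE_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

# To actually send it: urllib.request.urlopen(build_request("Describe a crash."))
```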
```shell
conda create -n scenicNL python=3.11
conda activate scenicNL
pip install -e .
```
An example command to run our pipeline:

```shell
gen_scenic --query_path <path-to-descriptions> --output-path <path-to-output> --model gpt-3.5-turbo-0613 --llm_prompt_type predict_few_shot
```
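To prepare inputs for `--query_path`, you need crash report descriptions on disk. The helper below is a minimal sketch assuming the pipeline reads one plain-text description per file; the exact input format ScenicNL expects may differ, and `write_descriptions` is our own illustrative name.

```python
from pathlib import Path

def write_descriptions(descriptions, query_dir):
    """Write one plain-text crash description per file under query_dir.

    Assumes gen_scenic consumes a directory of .txt files; adjust to the
    actual input format if it differs.
    """
    query_dir = Path(query_dir)
    query_dir.mkdir(parents=True, exist_ok=True)
    paths = []
    for i, text in enumerate(descriptions):
        path = query_dir / f"report_{i:03d}.txt"
        path.write_text(text, encoding="utf-8")
        paths.append(path)
    return paths
```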
This project is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. You can view the full license terms here.
© 2024 Karim Elmaaroufi