I "wrote" (more like AI wrote) this because I was building a RAG project and just wanted to keep all my prompts cleanly in one config file. I don't even know what to call it... it looks clean, though, and you can define MCP tools in it. I know, sounds boring, right? Yeah... I'm just using it internally.
```shell
pip install py_promptkit
```

there ya go! prefix with whatever you like.
Create `prompts.toml`:

```toml
[models]
chat = "gpt-4o-mini"

[providers]
chat = "openai"

[temperatures]
chat = 0.7

[chat]
template = "You are a helpful assistant. {user_message}"
```

Use it:
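(Quick aside on the template syntax: I'm assuming the `{user_message}` placeholder gets filled with plain Python `str.format`-style substitution — that's a guess about the internals, not documented behavior:)

```python
# Hypothetical sketch of what the loader presumably does with a template.
# Not py_promptkit's actual code -- just str.format substitution.
template = "You are a helpful assistant. {user_message}"
prompt = template.format(user_message="explain quantum computing")
print(prompt)  # -> You are a helpful assistant. explain quantum computing
```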
```python
from py_promptkit import PromptLoader, PromptRunner
from py_promptkit.litellm.core import LiteLLMClient

loader = PromptLoader("prompts.toml")
loader.load()

with PromptRunner(loader) as runner:
    runner.register_client("openai", LiteLLMClient(secrets={"OPENAI_API_KEY": "sk-..."}))

    result = runner.run("chat", {"user_message": "explain quantum computing"})
    print(result["output"])

    # Or stream it:
    for chunk in runner.run_stream("chat", {"user_message": "..."}):
        print(chunk, end="", flush=True)
```