intercepted16/py-promptkit
# PromptKit

"Wrote" (more like AI wrote) this because I was building a RAG project and just wanted to put all my prompts cleanly in one config file. I don't even know what to call it... looks clean though, and you can define MCP tools in it. I know, sounds boring, right? Yeah, I'm just using it internally.

## Installation

```shell
pip install py_promptkit
```

there ya go! prefix the command with whatever you like (`uv`, `python -m`, etc.)

## Usage

Create prompts.toml:

```toml
[models]
chat = "gpt-4o-mini"

[providers]
chat = "openai"

[temperatures]
chat = 0.7

[chat]
template = "You are a helpful assistant. {user_message}"
```

Use it:

```python
from py_promptkit import PromptLoader, PromptRunner
from py_promptkit.litellm.core import LiteLLMClient

# Parse and validate prompts.toml
loader = PromptLoader("prompts.toml")
loader.load()

with PromptRunner(loader) as runner:
    # Swap in your real key here (or read it from the environment)
    runner.register_client("openai", LiteLLMClient(secrets={"OPENAI_API_KEY": "sk-..."}))
    result = runner.run("chat", {"user_message": "explain quantum computing"})
    print(result["output"])
```

### Stream Responses

```python
# Inside the same `with PromptRunner(loader) as runner:` block as above
for chunk in runner.run_stream("chat", {"user_message": "..."}):
    print(chunk, end="", flush=True)
```
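If you want the full text as well as the live output, you can accumulate the chunks as they arrive. A sketch with a stand-in generator (`fake_stream` is hypothetical; substitute `runner.run_stream(...)`):

```python
def fake_stream():
    # Stand-in for runner.run_stream(...): any iterable of text chunks works
    yield from ["Quantum ", "computing ", "uses ", "qubits."]

chunks = []
for chunk in fake_stream():
    print(chunk, end="", flush=True)  # live output, same as above
    chunks.append(chunk)
print()  # newline once the stream ends

full_text = "".join(chunks)
assert full_text == "Quantum computing uses qubits."
```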

## About

Run any LLM provider, and manage config and MCP tools, from one small TOML file.
