
routerflu – Structured LLM Response Extractor


Streamline interactions with large language models (LLMs) like Claude via OpenRouter by processing natural language inputs into structured, pattern-matched outputs. routerflu ensures consistent, extractable responses for programming, data querying, or content creation tasks.


📌 Key Features

✅ Pattern-Matched Outputs – Forces LLM responses to follow strict regex patterns for reliability.
✅ Flexible LLM Integration – Works with LLM7 (default), OpenAI, Anthropic, Google, or any BaseChatModel.
✅ Environment-Aware – Uses LLM7_API_KEY from env vars or accepts direct API keys.
✅ Minimal Dependencies – Built on langchain and llmatch_messages.


🚀 Installation

pip install routerflu

🔧 Usage Examples

1. Basic Usage (Default: LLM7)

from routerflu import routerflu

response = routerflu(
    user_input="Write a Python function to reverse a string."
)
print(response)  # Structured output matching predefined patterns

2. Custom LLM Integration

OpenAI

from langchain_openai import ChatOpenAI
from routerflu import routerflu

llm = ChatOpenAI()
response = routerflu(user_input="Explain how REST APIs work.", llm=llm)

Anthropic (Claude)

from langchain_anthropic import ChatAnthropic
from routerflu import routerflu

llm = ChatAnthropic(model="claude-3-5-sonnet-latest")  # model is required
response = routerflu(user_input="Debug this SQL query.", llm=llm)

Google Vertex AI

from langchain_google_genai import ChatGoogleGenerativeAI
from routerflu import routerflu

llm = ChatGoogleGenerativeAI(model="gemini-1.5-pro")  # model is required
response = routerflu(user_input="Summarize this document.", llm=llm)

🔑 Configuration

API Key

  • Default: Uses LLM7_API_KEY from environment variables.
  • Manual Override:
    routerflu(user_input="...", api_key="your_llm7_api_key")
  • Get a Free Key: LLM7 Token Registration
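The key-resolution order above (explicit argument first, then the LLM7_API_KEY environment variable) can be sketched as follows. Note that `resolve_api_key` is an illustrative helper, not part of the routerflu API:

```python
import os
from typing import Optional


def resolve_api_key(api_key: Optional[str] = None) -> str:
    # An explicitly passed key takes precedence; otherwise fall back
    # to the LLM7_API_KEY environment variable.
    key = api_key or os.environ.get("LLM7_API_KEY")
    if key is None:
        raise ValueError("No API key: pass api_key= or set LLM7_API_KEY")
    return key
```

Passing `api_key="..."` to routerflu follows the same precedence: the explicit argument always wins over the environment.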

Rate Limits

  • LLM7 Free Tier: Sufficient for most use cases.
  • Upgrade: Use a custom API key or switch to a paid plan.

📦 Dependencies

  • langchain-core (for BaseChatModel)
  • llmatch_messages (for pattern extraction)
  • langchain_llm7 (default LLM provider)

📝 Function Signature

routerflu(
    user_input: str,
    api_key: Optional[str] = None,
    llm: Optional[BaseChatModel] = None
) -> List[str]
  • user_input (str): Natural language prompt for the LLM.
  • api_key (Optional[str]): LLM7 API key (falls back to env var LLM7_API_KEY).
  • llm (Optional[BaseChatModel]): Custom LLM (e.g., ChatOpenAI, ChatAnthropic).
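Because the function returns a List[str] of extracted blocks, callers often want just the first one. A small helper (hypothetical, not part of the package) can do that while guarding against an empty result:

```python
from typing import List


def first_match(matches: List[str]) -> str:
    # routerflu returns every pattern-matched block; take the first
    # and fail loudly if the list is empty.
    if not matches:
        raise RuntimeError("No pattern-matched content in the response")
    return matches[0]


# Usage (routerflu call shown for context):
#   code = first_match(routerflu(user_input="Reverse a string in Python."))
```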

🔄 How It Works

  1. System Prompt: Guides the LLM to format responses strictly.
  2. Pattern Matching: Uses regex to extract structured data from responses.
  3. Error Handling: Raises RuntimeError if the LLM fails to comply.
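Steps 2 and 3 can be sketched as below. The `<answer>` tag pattern and the `extract_structured` function are illustrative assumptions for this sketch, not routerflu's actual internals (the real extraction lives in llmatch_messages):

```python
import re
from typing import List

# Hypothetical pattern the system prompt would instruct the LLM to follow.
ANSWER_PATTERN = re.compile(r"<answer>(.*?)</answer>", re.DOTALL)


def extract_structured(raw_response: str) -> List[str]:
    # Step 2: pull every pattern-matched block out of the raw response.
    matches = ANSWER_PATTERN.findall(raw_response)
    if not matches:
        # Step 3: a non-compliant response raises RuntimeError.
        raise RuntimeError("LLM response did not match the required pattern")
    return matches
```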

📜 License

MIT


📢 Support & Issues

