promptify_summary is a lightweight Python package that turns arbitrary text
(e.g. video titles, headlines, or any user‑supplied string) into concise,
structured summaries using a large language model.
The package relies on pattern matching to guarantee consistent, predictable
output regardless of the LLM provider. It works with the default
ChatLLM7 backend out of the box while also allowing you to plug in any
LangChain-compatible LLM.
```bash
pip install promptify_summary
```

```python
# Basic usage with the default LLM7 backend
from promptify_summary import promptify_summary

user_input = "Learn how to deploy a Docker container in 5 minutes!"
summary = promptify_summary(user_input)
print(summary)
# >>> ['Deploy Docker Container', '5 minutes', ...]  # Example output
```

You can swap the default ChatLLM7 for any LangChain LLM. Below are examples with OpenAI, Anthropic, and Google Generative AI.
```python
from langchain_openai import ChatOpenAI
from promptify_summary import promptify_summary

llm = ChatOpenAI()  # Uses the default OpenAI key from the environment
user_input = "How to tune a PostgreSQL database?"
summary = promptify_summary(user_input, llm=llm)
```

```python
from langchain_anthropic import ChatAnthropic
from promptify_summary import promptify_summary

llm = ChatAnthropic()  # Uses your Anthropic key from the environment
summary = promptify_summary("Explain quantum entanglement.", llm=llm)
```

```python
from langchain_google_genai import ChatGoogleGenerativeAI
from promptify_summary import promptify_summary

llm = ChatGoogleGenerativeAI()  # Uses your Google key from the environment
summary = promptify_summary("What is Python 3.11?", llm=llm)
```

```python
promptify_summary(
    user_input: str,
    api_key: Optional[str] = None,
    llm: Optional[BaseChatModel] = None
) -> List[str]
```

| Parameter | Type | Description |
|---|---|---|
| `user_input` | `str` | Text to summarize. |
| `llm` | `Optional[BaseChatModel]` | Custom LangChain LLM instance. If `None`, ChatLLM7 is used. |
| `api_key` | `Optional[str]` | LLM7 API key. If omitted, the package looks for the `LLM7_API_KEY` environment variable, falling back to a placeholder key. |
The function returns a list of strings extracted from the LLM’s response that match the internal regular‑expression pattern, ensuring output consistency.
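As a rough illustration of this post-processing step, the sketch below shows how regex-based extraction of items from a raw LLM response could look. The pattern and the helper name `_extract_items` are assumptions for the example, not the package's actual internals.

```python
import re
from typing import List

# Hypothetical sketch: the real pattern used by promptify_summary is
# internal and may differ from this assumed "quoted items" convention.
_ITEM_PATTERN = re.compile(r'"([^"]+)"')

def _extract_items(llm_response: str) -> List[str]:
    """Return every quoted item found in the raw LLM response."""
    return _ITEM_PATTERN.findall(llm_response)

# Example: a raw response containing quoted items
raw = 'Key points: "Deploy Docker Container", "5 minutes"'
print(_extract_items(raw))  # ['Deploy Docker Container', '5 minutes']
```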
The default ChatLLM7 backend is accessed via the `langchain_llm7` package.
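If you prefer to construct the default backend yourself and pass it explicitly, a sketch like the one below should work; the available `ChatLLM7` constructor options are not covered here and may vary between `langchain_llm7` versions.

```python
from langchain_llm7 import ChatLLM7
from promptify_summary import promptify_summary

# Sketch: instantiate the default backend explicitly and pass it as `llm`.
# Constructor arguments are omitted; see langchain_llm7's documentation
# for options such as model selection or authentication.
llm = ChatLLM7()
summary = promptify_summary("Summarize Kubernetes networking basics.", llm=llm)
print(summary)
```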
If you want to change the API key (for higher rate limits or a different account), provide the key directly or set the environment variable:

```bash
export LLM7_API_KEY="your_free_or_paid_api_key"
```

Or pass it programmatically:

```python
summary = promptify_summary("Sample text", api_key="your_api_key")
```

You can obtain a free key by registering at https://token.llm7.io/.
The LLM7 free tier rate limits are sufficient for most casual or small‑scale projects. For more intensive workloads, consider upgrading your LLM7 account to increase limits or supply your own LLM.
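For heavier workloads, one pattern is to reuse a single custom LLM instance across many inputs rather than creating one per call. The sketch below assumes a ChatOpenAI backend and an illustrative list of titles; it is not a prescribed workflow from the package itself.

```python
from langchain_openai import ChatOpenAI
from promptify_summary import promptify_summary

# Sketch: reuse one LLM instance for a batch of inputs (illustrative titles).
llm = ChatOpenAI()
titles = [
    "Learn how to deploy a Docker container in 5 minutes!",
    "How to tune a PostgreSQL database?",
]
summaries = {title: promptify_summary(title, llm=llm) for title in titles}
for title, summary in summaries.items():
    print(title, "->", summary)
```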
- Author: Eugene Evstafev
- Email: hi@euegne.plus
- GitHub: chigwell
For issues, feature requests, or questions, open an issue on our GitHub repository.
Enjoy automating structured content generation with promptify-summary!