LLM Event Digest is a Python package designed to process news headlines or short text inputs and generate structured summaries of events, such as service disruptions or incidents. Utilizing a language model, it extracts key details like the involved company, the nature of the disruption, and the cause, ensuring outputs conform to a predefined format for consistency and reliability. This tool is ideal for automated news monitoring, alert systems, or data aggregation where structured, error-free information extraction from text is required.
Install the package via pip:

```bash
pip install llm_event_digest
```

Here's an example of how to use the package in Python:
```python
from llm_event_digest import llm_event_digest

response = llm_event_digest(
    user_input="The internet service in downtown was down for 3 hours caused by a fiber cut.",
    api_key="your-llm7-api-key"  # Optional, if not set in environment variables
)
print(response)
```

**Parameters:**

- `user_input` (str): The text input (news headline or short description) to process.
- `llm` (Optional[BaseChatModel]): An optional LangChain language model instance. If not provided, the default `ChatLLM7` is used.
- `api_key` (Optional[str]): API key for LLM7. If not provided, it looks for the `LLM7_API_KEY` environment variable.
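To give a sense of what a structured event summary might look like, here is a minimal sketch using a plain dataclass. The `EventDigest` class and its field names (`company`, `disruption`, `cause`) are illustrative assumptions based on the description above, not the package's actual output schema:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical shape of a structured event summary; field names are
# illustrative assumptions, not the package's real schema.
@dataclass
class EventDigest:
    company: Optional[str]
    disruption: str
    cause: Optional[str]

# Roughly what the headline above could be distilled into.
digest = EventDigest(
    company="Downtown ISP",
    disruption="internet service outage (3 hours)",
    cause="fiber cut",
)
print(digest.cause)  # fiber cut
```

The point of a fixed schema like this is that downstream consumers (alerting, aggregation) can rely on the same fields being present for every input.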
The package uses `ChatLLM7` from `langchain_llm7` by default. You can also pass your own LLM instance, such as:
```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI()
response = llm_event_digest(
    user_input="Network outage in the city center.",
    llm=llm
)
```

Or:
```python
from langchain_anthropic import ChatAnthropic

llm = ChatAnthropic()
response = llm_event_digest(
    user_input="Server downtime due to maintenance.",
    llm=llm
)
```

And:
```python
from langchain_google_genai import ChatGoogleGenerativeAI

llm = ChatGoogleGenerativeAI()
response = llm_event_digest(
    user_input="Scheduled power outage.",
    llm=llm
)
```

The default rate limits of the LLM7 free tier are suitable for most use cases. For higher usage, obtain an API key from https://token.llm7.io/ and pass it via the `LLM7_API_KEY` environment variable or directly in the function call.
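If you prefer the environment-variable route, setting the key before calling the function is enough. A minimal sketch (the placeholder key value is an assumption for illustration):

```python
import os

# Set the LLM7 API key via environment variable; per the docs above,
# llm_event_digest falls back to LLM7_API_KEY when api_key is not
# passed explicitly.
os.environ["LLM7_API_KEY"] = "your-llm7-api-key"

# Subsequent calls to llm_event_digest(...) can then omit api_key.
print(os.environ["LLM7_API_KEY"])
```

In deployment you would typically set the variable in your shell profile or service configuration rather than in code.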
If you encounter any issues or have questions, please open an issue on the GitHub repository: https://github.com/chigwell/llm-event-digest/issues
Eugene Evstafev
Email: hi@euegne.plus
GitHub: chigwell