Query Node

kleer001 edited this page Feb 6, 2025 · 2 revisions

The Query Node interfaces with Large Language Models (LLMs) to process text prompts and generate responses. It acts as a bridge between the node graph system and local LLM installations, enabling prompt-based text generation and processing. The node supports both single and batch prompt processing, with an option to limit processing during development or to manage resource usage.

Key Features

LLM Integration

  • Automatically detects and connects to local LLM installations
  • Supports dynamic LLM selection and switching
  • Provides fallback mechanisms when preferred LLM is unavailable
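The detection-with-fallback behaviour above might look something like the following sketch. This is illustrative only: the backend names, the executable mapping, and `find_local_llm` are assumptions, not the node's actual API.

```python
import shutil

# Assumed mapping from backend name to its command-line executable.
BACKENDS = {"Ollama": "ollama", "llama.cpp": "llama-server"}

def find_local_llm(preferred="Ollama"):
    """Return the preferred backend if it is installed, otherwise the
    first available fallback, otherwise None."""
    order = [preferred] + [b for b in BACKENDS if b != preferred]
    for name in order:
        # Availability check: is the backend's executable on PATH?
        if name in BACKENDS and shutil.which(BACKENDS[name]):
            return name
    return None
```

Checking PATH is one cheap way to detect a local installation; a real node might instead probe the backend's HTTP endpoint.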

Prompt Processing

  • Handles both single and batch prompt processing
  • Maintains response history
  • Supports forced response regeneration
  • Provides clean, formatted LLM responses
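The caching, history, and forced-regeneration behaviour listed above can be sketched as follows. `QueryNode` here is a hypothetical stand-in for the real node's internals, and `ask` stands in for the actual LLM call.

```python
class QueryNode:
    """Illustrative sketch: batch prompt processing with a response
    cache, a response history, and forced regeneration."""

    def __init__(self, ask):
        self.ask = ask          # callable: prompt -> raw response text
        self.response = []      # history of the latest responses
        self._cache = {}        # prompt -> cached cleaned response

    def process(self, prompts, force=False):
        out = []
        for p in prompts:
            # Re-query on a cache miss, or always when force=True.
            if force or p not in self._cache:
                self._cache[p] = self.ask(p).strip()  # clean formatting
            out.append(self._cache[p])
        self.response = out     # updated after each successful run
        return out
```

With `force=True` the node bypasses the cache, mirroring the `respond` button described under Parameters.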

Parameters

limit (bool)

When True, restricts processing to only the first prompt. Useful for testing or managing resource usage.
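A minimal sketch of how the `limit` flag might gate a batch, assuming the node simply truncates the prompt list before processing (`select_prompts` is a hypothetical helper, not the node's real code):

```python
def select_prompts(prompts, limit=False):
    """With limit=True, keep only the first prompt; handy while
    testing or conserving resources."""
    return prompts[:1] if limit else prompts
```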

response (List[str])

Stores the history of LLM responses. Updated after each successful processing.

llm_name (str)

Identifier for the target LLM (e.g., "Ollama"). Defaults to "Ollama" but can be auto-detected.

find_llm (button)

Triggers automatic LLM detection and updates llm_name with found installation.

respond (button)

Forces reprocessing of current prompts and updates responses regardless of cache.
