19 changes: 19 additions & 0 deletions .github/workflows/docker.yml
@@ -73,6 +73,19 @@ jobs:
uses: docker/setup-buildx-action@v3
if: github.event_name != 'schedule' && github.event_name != 'workflow_dispatch'

- name: Get version info
id: version
if: github.event_name != 'schedule' && github.event_name != 'workflow_dispatch'
run: |
if [[ "${{ github.ref }}" == refs/tags/v* ]]; then
VERSION="${{ github.ref_name }}"
VERSION="${VERSION#v}"
else
VERSION="dev"
fi
echo "version=${VERSION}" >> $GITHUB_OUTPUT
echo "commit=${{ github.sha }}" >> $GITHUB_OUTPUT

- name: Docker meta
id: docker-meta
if: github.event_name != 'schedule' && github.event_name != 'workflow_dispatch'
@@ -95,6 +108,9 @@ jobs:
file: Dockerfile
load: true
tags: ${{ env.REPO }}:scan
build-args: |
VERSION=${{ steps.version.outputs.version }}
COMMIT=${{ steps.version.outputs.commit }}
cache-from: type=gha
cache-to: type=gha,mode=max

@@ -132,5 +148,8 @@ jobs:
push: true
tags: ${{ steps.docker-meta.outputs.tags }}
labels: ${{ steps.docker-meta.outputs.labels }}
build-args: |
VERSION=${{ steps.version.outputs.version }}
COMMIT=${{ steps.version.outputs.commit }}
cache-from: type=gha
cache-to: type=gha,mode=max
10 changes: 8 additions & 2 deletions Dockerfile
@@ -1,5 +1,9 @@
FROM golang:1.25-alpine AS builder

# Build arguments for version injection
ARG VERSION=dev
ARG COMMIT=unknown

WORKDIR /app

# Copy go mod files
@@ -11,8 +15,10 @@ RUN go mod download
# Copy source code
COPY . .

# Build the application
RUN CGO_ENABLED=0 GOOS=linux go build -a -installsuffix cgo -o llm-action .
# Build the application with version information
RUN CGO_ENABLED=0 GOOS=linux go build -a -installsuffix cgo \
-ldflags "-s -w -X main.Version=${VERSION} -X main.Commit=${COMMIT}" \
-o llm-action .

# Final stage
FROM alpine:3.22
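The `-X main.Version=…` and `-X main.Commit=…` linker flags in the Dockerfile only take effect if the program declares matching package-level string variables in `package main`. A minimal sketch of the Go side under that assumption (the variable names come from the ldflags above; the surrounding program is illustrative, not the action's actual code):

```go
package main

import "fmt"

// Defaults used for local builds; overridden at image build time via:
//   go build -ldflags "-X main.Version=${VERSION} -X main.Commit=${COMMIT}"
var (
	Version = "dev"
	Commit  = "unknown"
)

// versionString formats the injected build metadata for display.
func versionString() string {
	return fmt.Sprintf("LLM-action %s (commit %s)", Version, Commit)
}

func main() {
	// Without ldflags this prints: LLM-action dev (commit unknown)
	fmt.Println(versionString())
}
```

Note that `-X` can only set uninitialized or string-literal-initialized package-level `string` variables; it silently does nothing if the name or package path does not match.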
90 changes: 80 additions & 10 deletions README.md
@@ -44,6 +44,12 @@ A GitHub Action to interact with OpenAI-compatible LLM services, supporting cust
- [Using with Ollama](#using-with-ollama)
- [Chain Multiple LLM Calls](#chain-multiple-llm-calls)
- [Debug Mode](#debug-mode)
- [Custom HTTP Headers](#custom-http-headers)
- [Default Headers](#default-headers)
- [Custom Headers](#custom-headers)
- [Single Line Format](#single-line-format)
- [Multiline Format](#multiline-format)
- [Headers with Custom Authentication](#headers-with-custom-authentication)
- [Supported Services](#supported-services)
- [Security Considerations](#security-considerations)
- [License](#license)
@@ -62,6 +68,7 @@ A GitHub Action to interact with OpenAI-compatible LLM services, supporting cust
- 🐛 Debug mode with secure API key masking
- 🎨 Go template support for dynamic prompts with environment variables
- 🛠️ Structured output via function calling (tool schema support)
- 📋 Custom HTTP headers support for log analysis and custom authentication

## Inputs

@@ -78,13 +85,14 @@ A GitHub Action to interact with OpenAI-compatible LLM services, supporting cust
| `temperature` | Temperature for response randomness (0.0-2.0) | No | `0.7` |
| `max_tokens` | Maximum tokens in the response | No | `1000` |
| `debug` | Enable debug mode to print all parameters (API key will be masked) | No | `false` |
| `headers` | Custom HTTP headers for API requests. Format: `Header1:Value1,Header2:Value2` or multiline | No | `''` |

## Outputs

| Output | Description |
| ------------ | ------------------------------------------------------------------------------------------------- |
| `response` | The raw response from the LLM (always available) |
| `<field>` | When using tool_schema, each field from the function arguments JSON becomes a separate output |
| Output | Description |
| ---------- | --------------------------------------------------------------------------------------------- |
| `response` | The raw response from the LLM (always available) |
| `<field>` | When using tool_schema, each field from the function arguments JSON becomes a separate output |

**Output Behavior:**

@@ -142,7 +150,7 @@ uses: appleboy/LLM-action@main

### With System Prompt

```yaml
````yaml
Copilot AI (Dec 9, 2025): Code fence markers should use triple backticks (```) instead of quadruple backticks (````). Quadruple backticks are non-standard and may not render correctly in Markdown.
- name: Code Review with LLM
id: review
uses: appleboy/LLM-action@v1
@@ -162,7 +170,7 @@ uses: appleboy/LLM-action@main
- name: Post Review Comment
run: |
echo "${{ steps.review.outputs.response }}"
```
````

### With Multiline System Prompt

@@ -194,7 +202,7 @@ uses: appleboy/LLM-action@main

Instead of embedding long prompts in YAML, you can load them from a file:

```yaml
````yaml
- name: Code Review with Prompt File
id: review
uses: appleboy/LLM-action@v1
@@ -208,7 +216,7 @@ Instead of embedding long prompts in YAML, you can load them from a file:
def calculate(x, y):
return x / y
```
```
````

Or using `file://` prefix:

@@ -417,7 +425,7 @@ Use `tool_schema` to get structured JSON output from the LLM using function call

#### Code Review with Structured Output

```yaml
````yaml
- name: Structured Code Review
id: review
uses: appleboy/LLM-action@v1
@@ -466,7 +474,7 @@ Use `tool_schema` to get structured JSON output from the LLM using function call
echo "Score: $SCORE"
echo "Issues: $ISSUES"
echo "Suggestions: $SUGGESTIONS"
```
````

**Why use environment variables instead of direct interpolation?**

@@ -708,6 +716,68 @@ main.Config{

**Security Note:** When debug mode is enabled, the API key is automatically masked (only showing first 4 and last 4 characters) to prevent accidental exposure in logs.

### Custom HTTP Headers

#### Default Headers

Every API request automatically includes the following headers for identification and log analysis:

| Header | Value | Description |
| ------------------ | ---------------------- | --------------------------------------- |
| `User-Agent` | `LLM-action/{version}` | Standard User-Agent with action version |
| `X-Action-Name` | `appleboy/LLM-action` | Full name of the GitHub Action |
| `X-Action-Version` | `{version}` | Semantic version of the action |

These headers help you identify requests from this action in your LLM service logs.

#### Custom Headers

Use the `headers` input to add custom HTTP headers to API requests. This is useful for:

- Adding request tracking IDs for log analysis
- Custom authentication headers
- Passing metadata to your LLM service

#### Single Line Format

```yaml
- name: Call LLM with Custom Headers
uses: appleboy/LLM-action@v1
with:
api_key: ${{ secrets.OPENAI_API_KEY }}
input_prompt: "Hello, world!"
headers: "X-Request-ID:${{ github.run_id }},X-Trace-ID:${{ github.sha }}"
```

#### Multiline Format

```yaml
- name: Call LLM with Multiple Headers
uses: appleboy/LLM-action@v1
with:
api_key: ${{ secrets.OPENAI_API_KEY }}
input_prompt: "Analyze this code"
headers: |
X-Request-ID:${{ github.run_id }}
X-Trace-ID:${{ github.sha }}
X-Environment:production
X-Repository:${{ github.repository }}
```
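The single-line and multiline formats above reduce to one parsing rule: split entries on commas or newlines, then split each entry on the first colon. A sketch of that rule (illustrative only, not the action's actual parser):

```go
package main

import (
	"fmt"
	"strings"
)

// parseHeaders splits the `headers` input into name/value pairs.
// Entries are separated by commas or newlines; each entry is
// "Name:Value", split on the first colon so values may contain colons.
func parseHeaders(raw string) map[string]string {
	headers := map[string]string{}
	entries := strings.FieldsFunc(raw, func(r rune) bool {
		return r == ',' || r == '\n'
	})
	for _, entry := range entries {
		name, value, ok := strings.Cut(strings.TrimSpace(entry), ":")
		if !ok || name == "" {
			continue // skip empty or malformed entries
		}
		headers[strings.TrimSpace(name)] = strings.TrimSpace(value)
	}
	return headers
}

func main() {
	fmt.Println(parseHeaders("X-Request-ID:123,X-Trace-ID:abc"))
}
```

One caveat of the comma-separated form: a value that itself contains a comma must use the multiline format instead.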

#### Headers with Custom Authentication

```yaml
- name: Call Custom LLM Service
uses: appleboy/LLM-action@v1
with:
base_url: "https://your-llm-service.com/v1"
api_key: ${{ secrets.LLM_API_KEY }}
input_prompt: "Generate a summary"
headers: |
X-Custom-Auth:${{ secrets.CUSTOM_AUTH_TOKEN }}
X-Tenant-ID:my-tenant
```
Copilot AI (Dec 9, 2025), on lines +769 to +779: This example shows putting a secret token into headers via X-Custom-Auth:${{ secrets.CUSTOM_AUTH_TOKEN }}, but the implementation's debug mode dumps the entire Config (including Headers) with godump.Dump and only masks api_key. If a workflow follows this pattern and enables debug: true, the custom auth token will be written in cleartext to GitHub Actions logs, exposing credentials to anyone with log access. Consider redacting header values in debug output (or clearly documenting that headers should not contain secrets when debug is enabled), similar to how api_key is masked.

## Supported Services

This action works with any OpenAI-compatible API, including: