64 changes: 61 additions & 3 deletions README.md
@@ -18,6 +18,9 @@ Automated agents and agentic workflows (like Ralph, AutoGPT, or custom CI/CD bui

* **Directory Scanning:** Scan a single file OR an entire folder of specs (Distributed Compliance).
* **Compliance Reporting:** Generate executive summaries (`--output compliance`) with % completion metrics.
* **Code-to-Spec Verification (Drift Detection):** πŸ†•
* **Reality Checks:** Verify if claims in the Spec (e.g., "Database: Postgres") actually exist in the code (e.g., `requirements.txt`).
* **Drift Reporting:** Flag contradictions between documentation and implementation files.
* **Deep Validation:**
* **Field Patterns:** Validate specific values (e.g., "Retention: 30 days") using regex.
* **Structure:** Ensure sections contain specific subsections (`must_contain`).
@@ -31,7 +34,7 @@ Automated agents and agentic workflows (like Ralph, AutoGPT, or custom CI/CD bui
* **Gap Severity Model:** Categorizes issues as **CRITICAL**, **HIGH**, **MEDIUM**, or **LOW**.
* **SARIF Output:** Native integration with GitHub Advanced Security and GitLab Security Dashboards.
* **Exception Management:** Formalize risk acceptance using a `.nodignore` file.
* **Remote Rule Registry:** Securely fetch industry-standard rules via HTTPS with strict SSL verification.
* **Community Rules Library:** https://github.com/mraml/nod-rules

## **⚠️ Important Disclaimer**
@@ -53,6 +56,7 @@ Don't know what headers strict compliance requires? Let `nod` build the skeleton
```
# Generate a spec with all headers for EU AI Act, NIST, and OWASP
nod ai-spec.md --init --rules rules.yaml

```

### **2\. Build: Agentic Context Injection (`--export`)**
@@ -65,6 +69,7 @@ nod --export --rules rules.yaml

# Generate Cursor/Windsurf rules
nod --export cursor

```

### **3\. Audit: The Gatekeeper**
@@ -77,6 +82,7 @@ nod docs/ --strict --min-severity HIGH

# Generate Manager Report
nod docs/ --output compliance

```

### **4\. Maintain: Auto-Fix (`--fix`)**
@@ -85,6 +91,7 @@ Did you miss a new requirement? `nod` can append the missing sections for you.

```
nod docs/ --fix --rules rules.yaml

```

### **5\. Secure: Integrity Signing**
@@ -95,6 +102,7 @@ To verify that an audit result hasn't been tampered with, set the `NOD_SECRET_KE
export NOD_SECRET_KEY="my-secret-ci-key"
nod ai-spec.md --output json
# Output includes "signature": "a1b2c3..."

```
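
A consumer (e.g., a CI job) can recompute the signature to confirm the attestation hasn't been altered. A minimal sketch in Python, assuming the signature is an HMAC-SHA256 hex digest of the attestation JSON with the `signature` field removed (see `src/nod/security.py` for the authoritative scheme):

```
import hashlib, hmac, json, os

def verify_signature(attestation: dict) -> bool:
    # Illustrative only: assumes HMAC-SHA256 over sorted-key JSON minus "signature".
    # The real canonicalization is whatever sign_attestation() does.
    payload = {k: v for k, v in attestation.items() if k != "signature"}
    body = json.dumps(payload, sort_keys=True).encode()
    expected = hmac.new(os.environ["NOD_SECRET_KEY"].encode(), body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, attestation.get("signature", ""))
```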

### **6\. Baseline: Freeze & Verify**
@@ -107,12 +115,55 @@ nod docs/ --freeze

# Verify current state against lockfile (CI/CD)
nod docs/ --verify

```

## **πŸ’‘ CLI Power Tips**

* **Registry Shorthand:** Skip manually downloading files. Use `registry:name` to fetch from the official library.

```
nod docs/ --rules registry:owasp-llm
```

* **Silent Mode (`-q`):** Suppress banner art and success messages. Perfect for clean CI logs.

```
nod docs/ -q --strict
```

* **File Output (`--save-to`):** Save reports directly to a file without piping.

```
nod docs/ --output sarif --save-to report.sarif
```

## **🧠 Advanced Rule Logic**

**nod** supports sophisticated rule definitions in `rules.yaml` to handle complex compliance scenarios.

### **Reality Checks (Drift Detection)**

Ensure that what is written in the Spec actually exists in the Code.

```
reality_checks:
  # Check if the DB defined in Spec matches requirements.txt
  - spec_pattern: "Database:\\s*(\\w+)"    # Captures 'Postgres'
    target_file: "requirements.txt"        # Scans this file
    reality_pattern: "(?i)\\1"             # Looks for 'Postgres' (case-insensitive)
    severity: "HIGH"

  # Check if Isolation claims match Dockerfile
  - spec_pattern: "Isolation:\\s*(\\w+)"   # Captures 'Alpine'
    target_file: "Dockerfile"
    reality_pattern: "(?i)FROM.*\\1"       # Looks for 'FROM ... Alpine'
    severity: "CRITICAL"

```
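
Conceptually, each check captures a claim from the spec (group 1 of `spec_pattern`), substitutes it into `reality_pattern` wherever `\1` appears, and then searches the target file for a match. A rough Python illustration of that flow (a sketch, not nod's actual scanner code):

```
import re

def check_drift(spec_text, check):
    # Illustrative only: evaluate one reality check against spec text and a target file.
    m = re.search(check["spec_pattern"], spec_text)
    if not m:
        return None  # The spec makes no such claim, so there is nothing to verify
    claim = m.group(1)  # e.g. "Postgres"
    pattern = check["reality_pattern"].replace(r"\1", re.escape(claim))
    with open(check["target_file"]) as f:
        reality = f.read()
    if re.search(pattern, reality):
        return "PASS"
    return f"FAIL ({check['severity']}): spec claims '{claim}', no match in {check['target_file']}"
```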

### **Enforcement Modes**

Control *where* a requirement must appear.
@@ -121,6 +172,7 @@ Control *where* a requirement must appear.
- id: "## Data Privacy"
  mode: "in_all_files"  # Must exist in EVERY file scanned (e.g., footer policy)
  # Default mode is "at_least_one" (Distributed compliance)

```
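
The difference between the modes is simply "all files" versus "any file". A small sketch of how that decision might be made (hypothetical helper, not nod's internals):

```
def section_satisfied(headers_by_file, section_id, mode="at_least_one"):
    # headers_by_file maps each scanned file to the headers found in it
    hits = [section_id in headers for headers in headers_by_file.values()]
    if mode == "in_all_files":
        return all(hits)  # every scanned file must contain the section
    return any(hits)      # default: one file anywhere in the scan is enough
```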

### **Field Validation**
@@ -132,6 +184,7 @@ Go beyond headers. Check for specific content patterns.
must_match:
  - pattern: "Retention Period: \\d+ (days|years)"
    message: "Must specify numeric retention period"

```

### **Cross-Reference Validation**
@@ -142,6 +195,7 @@ Ensure traceability between documents (e.g., Threats must have Controls).
cross_references:
  - source: "Threat T-(\\d+)"
    must_have: "Control C-\\1"

```
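
Here `\1` in `must_have` is replaced with whatever `source` captured, so a document mentioning `Threat T-12` must also mention `Control C-12`. A simplified sketch of that substitution (illustrative, not nod's implementation):

```
import re

def missing_cross_refs(doc_text, rule):
    # Illustrative only: collect source matches whose required counterpart is absent.
    missing = []
    for m in re.finditer(rule["source"], doc_text):
        # Expand \1, \2, ... in must_have using the groups captured by this source match
        required = re.sub(r"\\(\d+)", lambda g: re.escape(m.group(int(g.group(1)))), rule["must_have"])
        if not re.search(required, doc_text):
            missing.append(m.group(0))
    return missing
```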

## **βš™οΈ Configuration (`rules.yaml`)**
@@ -172,7 +226,7 @@ jobs:

      # Run nod using the Official Action
      - name: Run nod Gatekeeper
        uses: mraml/nod@v2.0.0
        uses: mraml/nod@v2.1.0
        with:
          target: 'docs/'
          rules: 'rules.yaml'
@@ -187,11 +241,12 @@ jobs:
        if: always()
        with:
          sarif_file: nod-results.sarif

```

## **🀝 Contributing**

We welcome contributions\! Please see [CONTRIBUTING.md](https://github.com/mraml/nod/blob/main/CONTRIBUTING.md) for details on how to add new rules or features.

If you find **nod** useful for your organization, please consider **starring the repository** to help others find it.

@@ -201,6 +256,7 @@ Add this to your `README.md` to show if your specs are currently passing the gat

```
![Nod Gatekeeper](https://github.com/<username>/<repo>/actions/workflows/nod-gatekeeper.yml/badge.svg)

```

## **πŸ€– Transparency**
@@ -216,3 +272,5 @@ Apache 2.0





12 changes: 6 additions & 6 deletions src/nod/cli.py
@@ -4,7 +4,7 @@
import json
from .config import load_rules, load_ignore
from .scanner import Scanner, SEVERITY_MAP
from .generator import gen_template, gen_context, apply_fix
from .generator import gen_template, gen_context, gen_schema, apply_fix
from .reporters import gen_sarif, gen_report
from .security import sign_attestation, freeze, verify
from .utils import Colors, colorize
@@ -17,6 +17,7 @@ def main():
    parser.add_argument("--init", action="store_true")
    parser.add_argument("--fix", action="store_true")
    parser.add_argument("--export", nargs="?", const="context", choices=["context", "cursor", "windsurf"], help="Export context/rules")
    parser.add_argument("--export-schema", action="store_true", help="Export active rules as JSON Schema")
    parser.add_argument("--strict", action="store_true")
    parser.add_argument("--freeze", action="store_true")
    parser.add_argument("--verify", action="store_true")
@@ -35,6 +36,10 @@ def main():
    policy_version = config.get("version", "unknown")
    ignored = load_ignore(".nodignore")

    if args.export_schema:
        print(gen_schema(config, policy_version))
        sys.exit(0)

    if args.export:
        print(gen_context(config, policy_version, ignored, args.export))
        sys.exit(0)
@@ -101,8 +106,6 @@ def main():
    min_val = SEVERITY_MAP.get(args.min_severity, 0)

    for data in results.values():
        # In quiet mode, skip profile headers unless there's a failure inside?
        # Or just print failures. Let's print failures only in quiet mode.
        profile_buffer = []
        if not args.quiet:
            profile_buffer.append(f"\n[{colorize(data['label'], Colors.BOLD)}]")
@@ -127,7 +130,6 @@ def main():
            else:
                profile_buffer.append(f" {colorize('βœ…', Colors.GREEN)} [PASS] {name}")

        # In quiet mode, only append buffer if there were failures
        if not args.quiet or has_failures:
            summary.extend(profile_buffer)

@@ -141,7 +143,6 @@

    output_content = "\n".join(summary)

    # Check exit code based on severity for non-text outputs too
    if SEVERITY_MAP.get(max_sev_label, 0) >= SEVERITY_MAP.get(args.min_severity, 0):
        exit_code = 1

@@ -155,7 +156,6 @@ def main():
            print(f"Error saving file: {e}", file=sys.stderr)
            sys.exit(1)
    else:
        # Only print if there is content (quiet mode with no errors might be empty)
        if output_content.strip():
            print(output_content)

36 changes: 36 additions & 0 deletions src/nod/generator.py
@@ -1,4 +1,5 @@
import sys
import json
from typing import Dict, Any
from .utils import clean_header

@@ -49,6 +50,41 @@ def gen_context(config: Dict[str, Any], policy_version: str, ignored: list, fmt:

    return "\n".join(lines)

def gen_schema(config: Dict[str, Any], policy_version: str) -> str:
    """Generates a JSON Schema representation of the active policy."""
    schema = {
        "$schema": "http://json-schema.org/draft-07/schema#",
        "title": "nod Compliance Policy",
        "version": policy_version,
        "description": "Active compliance rules loaded by nod.",
        "type": "object",
        "properties": {
            "profiles": {
                "type": "object",
                "properties": {}
            }
        }
    }

    for name, data in config.get("profiles", {}).items():
        profile_schema = {
            "type": "object",
            "description": data.get('badge_label', name),
            "properties": {
                "requirements": {"type": "array", "items": {"type": "string"}},
                "red_flags": {"type": "array", "items": {"type": "string"}}
            }
        }

        # Populate specific requirements as enums/examples
        reqs = [r.get("label") or clean_header(r['id']) for r in data.get("requirements", [])]
        if reqs:
            profile_schema["properties"]["requirements"]["examples"] = reqs

        schema["properties"]["profiles"]["properties"][name] = profile_schema

    return json.dumps(schema, indent=2)

def apply_fix(path: str, results: Dict[str, Any]) -> None:
    """Appends missing sections to the target file."""
    # Logic to determine target file (if directory passed, create nod-compliance.md)
19 changes: 19 additions & 0 deletions src/nod/reporters.py
@@ -39,6 +39,8 @@ def gen_sarif(attestation: Dict[str, Any], path: str) -> Dict[str, Any]:
        if c.get("control_id"):
            props["compliance-ref"] = c["control_id"]
        props["security-severity"] = SARIF_SCORE_MAP.get(c["severity"], "1.0")
        if c.get("type") == "contradiction":
            props["tags"] = ["drift", "spec-contradiction"]

        desc = c.get("label") or rule_id
        rules.append({
@@ -82,11 +84,19 @@ def gen_sarif(attestation: Dict[str, Any], path: str) -> Dict[str, Any]:
def gen_report(attestation: Dict[str, Any]) -> str:
    """Generates a human-readable text report."""
    out = []
    contradictions = []

    for data in attestation["results"].values():
        chks = data.get("checks", [])
        pct = int((len([c for c in chks if c["status"] != "FAIL"]) / len(chks) * 100) if chks else 0)
        out.append(f"{data['label']} Report ({datetime.utcnow().strftime('%Y-%m-%d')})\nStatus: {pct}% Compliant\n")

        for c in chks:
            # Separate Contradictions/Drift for special section
            if c.get("type") == "contradiction" and c["status"] == "FAIL":
                contradictions.append(f"⚠️ {c['remediation']} (Line {c.get('line')} in {c.get('source')})")
                continue

            icon = {"FAIL": "❌", "EXCEPTION": "βšͺ", "SKIPPED": "⏭️"}.get(c["status"], "βœ…")
            ref = c.get("article") or c.get("control_id")
            name = c.get("label") or clean_header(c['id'])
@@ -98,4 +108,13 @@ def gen_report(attestation: Dict[str, Any]) -> str:
                out.append(f" Ev: {c['source']}:{c.get('line')}")
        out.append("")
        out.append("-" * 40)

    # Append Drift Report if contradictions found
    if contradictions:
        out.append("\n" + "="*40)
        out.append("πŸ“Š POTENTIAL CODE CONTRADICTIONS (DRIFT)")
        out.append("="*40)
        out.extend(contradictions)
        out.append("")

    return "\n".join(out)