Commit

Merge pull request #501 from owasp-noir/issue-476
Add AI Integration Documents
hahwul authored Jan 19, 2025
2 parents 7e6179c + 2387036 commit 36b262a
Showing 5 changed files with 68 additions and 9 deletions.
13 changes: 6 additions & 7 deletions README.md
@@ -30,13 +30,12 @@ OWASP Noir is an open-source project specializing in identifying attack surfaces

## Key Features

- Identify API endpoints and parameters from source code.
- Support various source code languages and frameworks.
- Provide analysts with technical information and security issues identified during source code analysis.
- Friendly pipeline & DevOps integration, offering multiple output formats (JSON, YAML, OAS spec) and compatibility with tools like curl and httpie.
- Friendly Offensive Security Tools integration, allowing usage with tools such as ZAP and Caido, Burpsuite.
- Identify security issues within the source code through rule-based passive scanning.
- Generate elegant and clear output results.
- Extract API endpoints and parameters from source code.
- Support multiple languages and frameworks.
- Uncover security issues with detailed analysis and rule-based passive scanning.
- Integrate seamlessly with DevOps pipelines and tools like curl, ZAP, and Caido.
- Deliver clear, actionable results in formats like JSON, YAML, and OAS.
- Enhance endpoint discovery with AI for unfamiliar frameworks and hidden APIs.

## Usage

56 changes: 56 additions & 0 deletions docs/_advanced/ai_integration.md
@@ -0,0 +1,56 @@
---
title: AI Integration
has_children: false
nav_order: 5
layout: page
---

# AI Integration
{: .d-inline-block }

New (v0.19.0)
{: .label .label-green }


## Flag Overview

* `--ollama http://localhost:11434`: Specify the Ollama server URL to connect to.
* `--ollama-model MODEL`: Specify the Ollama model name to be used for analysis.


## How to Use AI Integration
### Step 1: Install and Run Ollama

1. Install Ollama: Follow the instructions on the official Ollama website to install it for your platform.
2. Run the Model: Start the Ollama server and make sure the desired model is available. For example:

```bash
# Download LLM model
ollama pull llama3

# Run LLM model
ollama run llama3
```
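
Before moving on, you can optionally confirm that the server is reachable and that the model has been downloaded. The check below is a minimal sketch that assumes Ollama is listening on its default port, 11434:

```bash
# List locally available models; a successful response confirms the server is up
curl http://localhost:11434/api/tags
```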

### Step 2: Run Noir with AI Analysis

To leverage AI capabilities for additional analysis, use the following command:

```bash
noir -b . --ollama http://localhost:11434 --ollama-model llama3
```

This command runs the standard Noir scan while using the specified AI model for additional analysis.

![Noir with Ollama AI integration](../../images/advanced/ollama.jpeg)
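
The AI-assisted scan can also be combined with Noir's regular output options. The sketch below assumes the standard `-f`/`--format` and `-o`/`--output` flags behave as they do in a normal scan:

```bash
# Run the AI-assisted scan and save the combined results as JSON
# (assumes the standard -f/--format and -o/--output flags)
noir -b . --ollama http://localhost:11434 --ollama-model llama3 -f json -o endpoints.json
```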

## Benefits of AI Integration

* An LLM allows Noir to handle frameworks and languages beyond its built-in support.
* It can identify additional endpoints that a standard Noir scan might miss.
* Keep in mind that LLM-assisted analysis can produce false positives, and scan speed may drop depending on the model size and the performance of the machine hosting the service.

## Notes

* Ensure that the Ollama server is running and accessible at the specified URL before executing the command.
* Replace `llama3` with the name of whichever model you want to use, as in the example below.
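
For instance, to switch to another model, pull it first and then pass its name via `--ollama-model`. This is a minimal sketch; the model name `qwen2.5-coder` is only an illustrative choice, not a recommendation:

```bash
# Pull an alternative model (illustrative choice)
ollama pull qwen2.5-coder

# Point Noir at the same Ollama server, but use the new model
noir -b . --ollama http://localhost:11434 --ollama-model qwen2.5-coder
```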
2 changes: 1 addition & 1 deletion docs/_advanced/diff.md
@@ -1,7 +1,7 @@
---
title: Diff Mode
has_children: false
nav_order: 5
nav_order: 6
layout: page
---

6 changes: 5 additions & 1 deletion docs/_includes/usage.md
@@ -41,6 +41,10 @@ FLAGS:
--use-matchers string Send URLs that match specific conditions to the Deliver
--use-filters string Exclude URLs that match specified conditions and send the rest to Deliver

AI Integration:
--ollama http://localhost:11434 Specify the Ollama server URL
--ollama-model MODEL Specify the Ollama model name

DIFF:
--diff-path ./app2 Specify the path to the old version of the source code for comparison

@@ -51,7 +55,7 @@ FLAGS:

CONFIG:
--config-file ./config.yaml Specify the path to a configuration file in YAML format
--concurrency 100 Set concurrency
--concurrency 50 Set concurrency
--generate-completion zsh Generate Zsh/Bash/Fish completion script

DEBUG:
Binary file added docs/images/advanced/ollama.jpeg

0 comments on commit 36b262a
