Add support and tests for individual AOAI keys (#33)
* Add support and tests for individual AOAI keys

Fixes #32

* Externalize config to config.py
simonkurtz-MSFT authored Jun 4, 2024
1 parent e37dd8c commit 9fcd775
Showing 6 changed files with 361 additions and 76 deletions.
66 changes: 64 additions & 2 deletions PACKAGE_README.md
```python
import httpx

from openai_priority_loadbalancer import AsyncLoadBalancer, Backend
```
*Importing `httpx` lets us use `httpx.Client` and `httpx.AsyncClient`. This avoids having to update openai to at least
[1.17.0](https://github.com/openai/openai-python/releases/tag/v1.17.0). The `openai` properties for `DefaultHttpxClient` and `DefaultAsyncHttpxClient` are mere wrappers for `httpx.Client` and `httpx.AsyncClient`.*

### Configuring the Backends and Load Balancer with a Token Provider

**We strongly recommend using a managed identity in Azure and `DefaultAzureCredential` locally.** This section details that approach.

1. Define a list of backends according to the *Load Balancer Backend Configuration* section below.

```python
backends: List[Backend] = [
    Backend("oai-eastus.openai.azure.com", 1),
    Backend("oai-southcentralus.openai.azure.com", 1),
    Backend("oai-westus.openai.azure.com", 1, "/ai"),
]
```

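The client is then instantiated with a token provider. As a hedged sketch (assuming the standard `azure-identity` flow; the endpoint seeding mirrors the keyed examples below and is illustrative, not prescriptive):

```python
import httpx
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
from openai import AzureOpenAI
from openai_priority_loadbalancer import LoadBalancer

# Acquire Entra ID tokens for Azure OpenAI via DefaultAzureCredential.
token_provider = get_bearer_token_provider(DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default")

lb = LoadBalancer(backends)   # 'backends' as defined in the step above

client = AzureOpenAI(
    azure_endpoint = f"https://{backends[0].host}",   # must be seeded; the load balancer overwrites it per request
    azure_ad_token_provider = token_provider,
    api_version = "2024-04-01-preview",
    http_client = httpx.Client(transport = lb)        # inject the load balancer as the transport
)
```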

### Configuring the Backends and Load Balancer with individual Azure OpenAI API Keys

It's best to avoid using the Azure OpenAI instances' keys: a) you risk accidentally leaving credentials in your source code, and b) each instance has its own key, which means ongoing maintenance (environment-specific keys, key rotations, etc.).
However, if you do need to use keys, you can set one per Azure OpenAI backend starting with release `1.1.0`.

When a backend's `api_key` property is set, the `api-key` header is replaced with that backend's `api_key` value before the request is sent to the corresponding Azure OpenAI instance. See the examples below.

1. Define a list of backends according to the *Load Balancer Backend Configuration* section below, passing the API key as the last parameter (the values below are mock placeholders).

   *Optionally, a path can be added (e.g. `"/ai"`), which is prepended to the request path. This is uncommon and rarely needed functionality.*

```python
backends: List[Backend] = [
    Backend("oai-eastus.openai.azure.com", 1, None, 'c3d116584360f9960b38cccc5f44caba'),
    Backend("oai-southcentralus.openai.azure.com", 1, None, '21c14252762502e8fc78b61e21db114f'),
    Backend("oai-westus.openai.azure.com", 1, "/ai", 'd6370785453b2b9c331a94cb1b7aaa36')
]
```

1. Instantiate the load balancer and inject a new httpx client with the load balancer as the new transport.

**Synchronous**

```python
lb = LoadBalancer(backends)

client = AzureOpenAI(
azure_endpoint = f"https://{backends[0].host}", # Must be seeded, so we use the first host. It will get overwritten by the load balancer.
api_key = "obtain_from_load_balancer", # the value is not used, but it must be set
api_version = "2024-04-01-preview",
http_client = httpx.Client(transport = lb) # Inject the synchronous load balancer as the transport in a new default httpx client.
)
```

**Asynchronous**

```python
lb = AsyncLoadBalancer(backends)

client = AsyncAzureOpenAI(
azure_endpoint = f"https://{backends[0].host}", # Must be seeded, so we use the first host. It will get overwritten by the load balancer.
api_key = "obtain_from_load_balancer", # the value is not used, but it must be set
api_version = "2024-04-01-preview",
http_client = httpx.AsyncClient(transport = lb) # Inject the asynchronous load balancer as the transport in a new default async httpx client.
)
```

## Load Balancer Backend Configuration

At its core, the Load Balancer Backend configuration requires one or more backend hosts and a numeric priority starting at 1. Note that you define a host, not a URL.
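For illustration, a hypothetical helper (not part of the library) that reduces an accidentally pasted URL to the bare host a backend expects could look like:

```python
from urllib.parse import urlparse

def to_host(value: str) -> str:
    """Reduce a pasted URL to a bare host; return bare hosts unchanged."""
    if "://" in value:
        return urlparse(value).netloc
    return value.split("/")[0]

print(to_host("https://oai-eastus.openai.azure.com/openai"))  # → oai-eastus.openai.azure.com
print(to_host("oai-eastus.openai.azure.com"))                 # → oai-eastus.openai.azure.com
```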

### Backend Authentication

While we strongly recommend the use of managed identities, it is possible to use the Azure OpenAI API keys for each respective Azure OpenAI instance. Note that you are solely responsible for the safeguarding and injection of these keys.

```python
# Define the backends and their priority
backends = [
    Backend("oai-eastus-xxxxxxxx.openai.azure.com", 1, None, 'c3d116584360f9960b38cccc5f44caba'),
    Backend("oai-southcentralus-xxxxxxxx.openai.azure.com", 1, None, '21c14252762502e8fc78b61e21db114f'),
    Backend("oai-westus-xxxxxxxx.openai.azure.com", 1, None, 'd6370785453b2b9c331a94cb1b7aaa36')
]
```

### Using the Load Balancer

As these are the only changes to the [OpenAI Python API library](https://github.com/openai/openai-python) implementation, simply execute your Python code.
28 changes: 26 additions & 2 deletions README.md
It's also good to have some knowledge of authentication and identities.

## Authentication

**We strongly recommend using a managed identity in Azure and `DefaultAzureCredential` locally.**

Locally, you can log into Azure via the CLI (see the steps below) and use `DefaultAzureCredential` (what I use in my example). When deploying this application in Azure, it's recommended to use a managed identity for authentication.

### Azure OpenAI Keys

It's best to avoid using the Azure OpenAI instances' keys: a) you risk accidentally leaving credentials in your source code, and b) each instance has its own key, which means ongoing maintenance (environment-specific keys, key rotations, etc.). However, if you do need to use keys, you can set one for each Azure OpenAI backend.

When a backend's `api_key` property is set, the `api-key` header is replaced with that backend's `api_key` value before the request is sent to the corresponding Azure OpenAI instance.

## Getting Started


### Configuration

Execute the following git command to ensure that updates to `config.py` are not tracked and therefore not committed. This prevents accidental check-ins of real keys and values:
`git update-index --assume-unchanged config.py`
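To resume tracking `config.py` later (for instance, to commit an intentional change), clear the flag again:

```shell
git update-index --no-assume-unchanged config.py
```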

For the load-balanced approach, please use the same model across all instances.

1. Open [config.py](./config.py).
1. Replace `<your-aoai-model>` with the name of your Azure OpenAI model.
1. Replace `<your-aoai-instance>` with the primary/single Azure OpenAI instance.
1. Replace `<your-aoai-instance-1>`, `<your-aoai-instance-2>`, `<your-aoai-instance-3>` with all the Azure OpenAI instances you want to load-balance across. Delete entries you don't need. See [Load Balancer Backend Configuration](#load-balancer-backend-configuration) for details.

### Backend Authentication

While we strongly recommend the use of managed identities, it is possible to use the Azure OpenAI API keys for each respective Azure OpenAI instance. Note that you are solely responsible for the safeguarding and injection of these keys.

```python
# Define the backends and their priority
backends = [
    Backends("oai-eastus-xxxxxxxx.openai.azure.com", 1, None, 'c3d116584360f9960b38cccc5f44caba'),
    Backends("oai-southcentralus-xxxxxxxx.openai.azure.com", 1, None, '21c14252762502e8fc78b61e21db114f'),
    Backends("oai-westus-xxxxxxxx.openai.azure.com", 1, None, 'd6370785453b2b9c331a94cb1b7aaa36')
]
```