feat: support using the action with Azure-hosted OpenAI models #47

Merged — pakrym-oai merged 1 commit into main, Nov 5, 2025
Conversation
Force-pushed dbf8065 to c9cc493
bolinfest added a commit to openai/codex that referenced this pull request on Nov 3, 2025:
This PR introduces an `--upstream-url` option to the proxy CLI that determines the URL that Responses API requests should be forwarded to. To preserve existing behavior, the default value is `"https://api.openai.com/v1/responses"`.

The motivation for this change is that the [Codex GitHub Action](https://github.com/openai/codex-action) should support those who use the OpenAI Responses API via Azure. Relevant issues:

- openai/codex-action#28
- openai/codex-action#38
- openai/codex-action#44

Rather than introducing a bunch of new Azure-specific logic in the action, as openai/codex-action#44 proposes, we should leverage our Responses API proxy to get the _hardening_ benefits it provides: https://github.com/openai/codex/blob/d5853d9c47b1badad183f62622745cf47e6ff0f4/codex-rs/responses-api-proxy/README.md#hardening-details

This PR should make that straightforward to incorporate in the action. To see how the updated version of the action would consume these new options, see openai/codex-action#47.
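A rough sketch of how the new flag might be passed. The binary name `codex-responses-api-proxy` is taken from later in this thread; the Azure resource name is a placeholder, and any other flags the proxy requires are not shown — see the proxy's README for the actual CLI surface:

```shell
# Hypothetical invocation: forward Responses API requests to an
# Azure-hosted endpoint instead of the default api.openai.com URL.
codex-responses-api-proxy \
  --upstream-url "https://YOUR_RESOURCE.openai.azure.com/openai/v1/responses"
```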
Force-pushed 6d89768 to 7de6611
pakrym-oai reviewed on Nov 5, 2025:
```yaml
  required: false
  default: ""
responses-api-endpoint:
  description: "Optional Responses API endpoint override, e.g. https://example.openai.azure.com/openai/v1/responses. Defaults to the proxy's built-in endpoint when empty."
```
Collaborator
Defaults to Responses API when empty?
pakrym-oai reviewed on Nov 5, 2025:
```markdown
To configure the Action to use OpenAI models hosted on Azure, pay close attention to the following:

- The `responses-api-endpoint` must be set to the full URL (including any required query parameters) that Codex will `POST` to for a Responses API request. For Azure, this might look like `https://YOUR_PROJECT_NAME.openai.azure.com/openai/v1/responses`. Note that [unlike when customizing a model provider in Codex](https://github.com/openai/codex/blob/main/docs/config.md#azure-model-provider-example), you must include the `v1/responses` suffix in the URL yourself, if appropriate.
```
Collaborator
Should we mention api-version query parameter?
pakrym-oai approved these changes on Nov 5, 2025
Collaborator
Thank you!
This PR introduces a new, optional input to the GitHub Action, `responses-api-endpoint`, which makes it possible to use this Action with Azure. As explained in the updated `README.md`, if `responses-api-endpoint` is specified, it will be the URL that Codex hits to make Responses API requests. In practice, we expect this to be used mostly by Azure users. With the Azure account I just created, I got this working with the following configuration:

Under the hood, if `responses-api-endpoint` is non-empty, it simply becomes the `--upstream-url` argument to `codex-responses-api-proxy`.

TO DISCUSS: Better/shorter name than `responses-api-endpoint`?

This PR builds on the work of openai/codex#6129.
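The author's actual working configuration was not preserved in this page. As a rough sketch only, a workflow step using the new input might look like the following; the action ref, secret name, and every input other than `responses-api-endpoint` are assumptions here, not taken from this PR — consult the action's README for the real input names:

```yaml
# Hypothetical workflow step (input names other than
# `responses-api-endpoint` are placeholders).
- uses: openai/codex-action@main
  with:
    api-key: ${{ secrets.AZURE_OPENAI_API_KEY }}
    responses-api-endpoint: "https://YOUR_RESOURCE.openai.azure.com/openai/v1/responses"
```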