Various improvements #4

Merged (2 commits) on May 9, 2024

21 changes: 11 additions & 10 deletions README.md
@@ -42,16 +42,17 @@ jobs:
openai_api_key: ${{ secrets.OPENAI_API_KEY }}
```

| Input | Description | Required | Default |
| ----------------- | ----------------------------------------------------- | -------- | -------------------------- |
| `github_token` | The GitHub token to use for the Action | Yes | |
| `openai_api_key` | The [OpenAI API key] to use, keep it hidden | Yes | |
| `pull_request_id` | The ID of the pull request to use | No | Extracted from metadata |
| `openai_model` | The [OpenAI model] to use | No | `gpt-3.5-turbo` |
| `max_tokens` | The maximum number of **prompt tokens** to use | No | `1000` |
| `temperature` | Higher values will make the model more creative (0-2) | No | `0.6` |
| `sample_prompt` | The prompt to use for giving context to the model | No | See `SAMPLE_PROMPT` |
| `sample_response` | A sample response for giving context to the model | No | See `GOOD_SAMPLE_RESPONSE` |
| Input | Description | Required | Default |
| ------------------- | -------------------------------------------------------------- | -------- | -------------------------- |
| `github_token` | The GitHub token to use for the Action | Yes | |
| `openai_api_key` | The [OpenAI API key] to use, keep it hidden | Yes | |
| `pull_request_id` | The ID of the pull request to use | No | Extracted from metadata |
| `openai_model` | The [OpenAI model] to use | No | `gpt-3.5-turbo` |
| `max_tokens` | The maximum number of **prompt tokens** to use | No | `1000` |
| `temperature` | Higher values will make the model more creative (0-2) | No | `0.6` |
| `sample_prompt` | The prompt to use for giving context to the model | No | See `SAMPLE_PROMPT` |
| `sample_response` | A sample response for giving context to the model | No | See `GOOD_SAMPLE_RESPONSE` |
| `completion_prompt` | The prompt to use for the model to generate the PR description | No | See `COMPLETION_PROMPT` |


[OpenAI API key]: https://help.openai.com/en/articles/4936850-where-do-i-find-my-secret-api-key
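Building on the workflow snippet at the top of the README, the new optional input could be supplied like this. This is only a sketch: the prompt text is illustrative, and the surrounding step is assumed to match the existing usage example.

```yaml
    with:
      github_token: ${{ secrets.GITHUB_TOKEN }}
      openai_api_key: ${{ secrets.OPENAI_API_KEY }}
      # Optional: overrides the final instruction sent to the model.
      # When omitted, the script falls back to its built-in COMPLETION_PROMPT.
      completion_prompt: |
        Write a concise pull request description focused on the motivation
        behind the change. The following changes took place:
```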
4 changes: 4 additions & 0 deletions action.yml
@@ -35,6 +35,10 @@ inputs:
description: 'A sample of an ideal response based on the sample prompt'
required: false
default: ''
completion_prompt:
description: 'Prompt to use as the final prompt to the model; refer to COMPLETION_PROMPT in the Python file.'
required: false
default: ''

runs:
using: 'docker'
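For Docker container actions, GitHub exposes each declared input to the container as an environment variable named `INPUT_<UPPERCASED_INPUT_NAME>`, which is how the script picks up the new `completion_prompt` value. A minimal sketch of that lookup, mirroring the change in autofill_description.py below (the stand-in constant is illustrative):

```python
import os

# Stand-in for the module-level default defined in autofill_description.py.
COMPLETION_PROMPT = "Write a pull request description focusing on the motivation behind the change."

# GitHub maps the `completion_prompt` input declared in action.yml to the
# INPUT_COMPLETION_PROMPT environment variable inside the Docker container;
# fall back to the built-in default when the variable is not set.
completion_prompt = os.environ.get("INPUT_COMPLETION_PROMPT", COMPLETION_PROMPT)
```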
20 changes: 12 additions & 8 deletions autofill_description.py
@@ -46,6 +46,11 @@
Overall, this change will improve the quality of the project by helping us detect and prevent memory errors.
"""

COMPLETION_PROMPT = f"""
Write a pull request description focusing on the motivation behind the change and why it improves the project.
Go straight to the point. The following changes took place: \n
"""


def main():
parser = argparse.ArgumentParser(
@@ -98,6 +103,7 @@ def main():
model_sample_response = os.environ.get(
"INPUT_MODEL_SAMPLE_RESPONSE", GOOD_SAMPLE_RESPONSE
)
completion_prompt = os.environ.get("INPUT_COMPLETION_PROMPT", COMPLETION_PROMPT)
authorization_header = {
"Accept": "application/vnd.github.v3+json",
"Authorization": "token %s" % github_token,
@@ -153,12 +159,6 @@ def main():

pull_request_files.extend(pull_files_chunk)

completion_prompt = f"""
Write a pull request description focusing on the motivation behind the change and why it improves the project.
Go straight to the point.

The title of the pull request is "{pull_request_title}" and the following changes took place: \n
"""
for pull_request_file in pull_request_files:
# Not all PR file metadata entries may contain a patch section
# For example, entries related to removed binary files may not contain it
@@ -175,8 +175,8 @@ def main():
if len(completion_prompt) > max_allowed_characters:
completion_prompt = completion_prompt[:max_allowed_characters]

openai.api_key = openai_api_key
openai_response = openai.ChatCompletion.create(
openai_client = openai.OpenAI(api_key=openai_api_key)
openai_response = openai_client.chat.completions.create(
model=open_ai_model,
messages=[
{
@@ -185,6 +185,10 @@
},
{"role": "user", "content": model_sample_prompt},
{"role": "assistant", "content": model_sample_response},
{
"role": "user",
"content": "Title of the pull request: " + pull_request_title,
},
{"role": "user", "content": completion_prompt},
],
temperature=model_temperature,
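The switch from the module-level `openai.ChatCompletion.create` call to a client object reflects the breaking interface change in openai-python 1.x, which the requirements.txt bump below pins. A minimal sketch of the new call shape and response access, with an illustrative API key, model, and messages rather than the exact values used by the Action:

```python
import openai

# openai-python >= 1.0: create an explicit client instead of setting
# openai.api_key and calling openai.ChatCompletion.create().
client = openai.OpenAI(api_key="sk-...")  # placeholder key

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You write concise pull request descriptions."},
        {"role": "user", "content": "Title of the pull request: Various improvements"},
        {"role": "user", "content": "The following changes took place: ..."},
    ],
    temperature=0.6,
    max_tokens=1000,
)

# 1.x returns typed objects; the generated text is accessed via attributes.
generated_description = response.choices[0].message.content
print(generated_description)
```

Under 0.27.x the equivalent call was `openai.ChatCompletion.create(...)` with the reply read as `response["choices"][0]["message"]["content"]`, which is why the version bump and the code change ship together.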
2 changes: 1 addition & 1 deletion requirements.txt
@@ -1,2 +1,2 @@
requests>=2.18
openai==0.27.2
openai==1.27.0