Improve OpenAI Integration #34

Merged
merged 19 commits · Feb 8, 2025
Changes from 17 commits
4 changes: 4 additions & 0 deletions .cursorignore
@@ -0,0 +1,4 @@
target/
tmp/
.DS_Store
.git
42 changes: 35 additions & 7 deletions Cargo.lock

Some generated files are not rendered by default.

3 changes: 3 additions & 0 deletions Cargo.toml
@@ -31,6 +31,7 @@ thiserror = "2.0.11"
tokio = { version = "1.43", features = ["full"] }
futures = "0.3"
parking_lot = "0.12.3"
tracing = "0.1"

# CLI and UI
structopt = "0.3.26"
@@ -70,6 +71,8 @@ syntect = { version = "5.2", default-features = false, features = [
pulldown-cmark = "0.12"
comrak = "0.35"
textwrap = "0.16"
mustache = "0.9.0"
maplit = "1.0.2"
[dev-dependencies]
tempfile = "3.16.0"

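The two new crates, `mustache` and `maplit`, replace the earlier ad-hoc `str::replace` templating of the prompt. A minimal sketch of how they fit together, with an illustrative template string and values (the real template is `resources/prompt.md`, rendered in `src/commit.rs` below):

```rust
use maplit::hashmap;

fn main() {
  // Compile an inline template and substitute the placeholders, mirroring
  // the approach used in src/commit.rs. Template text and values here are
  // illustrative only.
  let template = "Limit the commit message to {{max_length}} characters.\n<diff>\n{{diff}}\n</diff>";
  let rendered = mustache::compile_str(template)
    .expect("template should compile")
    .render_to_string(&hashmap! {
      "max_length" => "72".to_string(),
      "diff" => "+ add tracing dependency".to_string(),
    })
    .expect("template should render");
  println!("{rendered}");
}
```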
50 changes: 37 additions & 13 deletions resources/prompt.md
@@ -1,18 +1,42 @@
You are an AI assistant that generates concise and meaningful git commit messages based on provided diffs. Please adhere to the following guidelines:
You are an AI assistant specialized in generating precise and concise git commit messages based on provided diffs. Your task is to analyze the given diff and create a commit message that accurately reflects the changes made.

- Structure: Begin with a clear, present-tense summary.
- Content: While you should use the surrounding context to understand the changes, your commit message should ONLY describe the lines marked with + or -.
- Understanding: Use the context (unmarked lines) to understand the purpose and impact of the changes, but do not mention unchanged code in the commit message.
- Changes: Only describe what was actually changed (added, removed, or modified).
- Consistency: Maintain uniformity in tense, punctuation, and capitalization.
- Accuracy: Ensure the message accurately reflects the changes and their purpose.
- Present tense, imperative mood. (e.g., "Add x to y" instead of "Added x to y")
- Max {{max_commit_length}} chars in the output
Here is the git diff you need to analyze:

## Output:
<diff>
{{diff}}
</diff>

Your output should be a commit message generated from the input diff and nothing else. While you should use the surrounding context to understand the changes, your message should only describe what was actually modified (+ or - lines).
The character limit for the commit message is:

## Input:
<max_length>
{{max_length}}
</max_length>

INPUT:
Please follow these guidelines when generating the commit message:

1. Analyze the diff carefully, focusing on lines marked with + or -.
2. Identify the files changed and the nature of the changes (added, modified, or deleted).
3. Determine the most significant change if multiple changes are present.
4. Create a clear, present-tense summary of the change in the imperative mood.
5. Ensure the commit message is within the specified character limit.
6. For binary files or unreadable diffs:
- Use the format "Add/Update/Delete binary file <filename>"
- Include file size in parentheses if available
- For multiple binary files, list them separated by commas

Before generating the final commit message, please analyze the diff, but keep your thought process to yourself:

1. Count and list all files changed in the diff, noting whether they were added, modified, or deleted. Prepend each file with a number.
2. For each changed file, summarize the key changes in bullet points and quote specific relevant lines from the diff.
3. Identify any binary files or unreadable diffs separately.
4. Determine the most significant change if multiple changes are present.
5. Consider the impact of each change and its relevance to the overall commit message.
6. Brainstorm keywords that could be used in the commit message.
7. Propose three potential single-line summaries based on the breakdown.
8. Count the characters in each proposed summary, ensuring they meet the specified character limit.
9. Select the best summary that accurately reflects the most significant change and meets the character limit.
10. Prefixes such as `refactor:` and `fix:` should be removed.

After your analysis, provide only the final commit message as output. Ensure it is clear, concise, and accurately reflects the content of the diff while adhering to the character limit. Do not include any additional text or explanations in your final output.

<COMMIT MESSAGE>
35 changes: 35 additions & 0 deletions src/client.rs
@@ -0,0 +1,35 @@
use anyhow::{Context, Result};
use serde_json;

use crate::model::Model;
use crate::openai::{self, Request as OpenAIRequest};

#[derive(Debug, Clone, PartialEq)]
pub struct Request {
  pub prompt: String,
  pub system: String,
  pub max_tokens: u16,
  pub model: Model
}

#[derive(Debug, Clone, PartialEq)]
pub struct Response {
  pub response: String
}

pub async fn call(request: Request) -> Result<Response> {
  // Use the OpenAI client for all models
  let openai_request = OpenAIRequest {
    prompt: request.prompt,
    system: request.system,
    max_tokens: request.max_tokens,
    model: request.model
  };

  let response = openai::call(openai_request).await?;
  Ok(Response { response: response.response })
}

pub async fn is_model_available(_model: Model) -> bool {
  true // OpenAI models are always considered available if API key is set
}
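A hedged sketch of how a caller elsewhere in the crate might use this wrapper; the `Model::default()` construction and the system prompt text are assumptions, since `src/model.rs` is not part of this diff:

```rust
use anyhow::Result;

use crate::client::{self, Request};
use crate::model::Model;

async fn commit_message_for(diff: String) -> Result<String> {
  // Assumes Model implements Default; substitute however the crate
  // actually selects a model.
  let request = Request {
    prompt: diff,
    system: "Generate a concise git commit message for the given diff.".to_string(),
    max_tokens: 256,
    model: Model::default()
  };

  let response = client::call(request).await?;
  Ok(response.response)
}
```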
39 changes: 29 additions & 10 deletions src/commit.rs
@@ -1,4 +1,5 @@
use anyhow::{bail, Result};
use maplit::hashmap;

use crate::{config, openai, profile};
use crate::model::Model;
@@ -8,9 +9,17 @@ const INSTRUCTION_TEMPLATE: &str = include_str!("../resources/prompt.md");

/// Returns the instruction template for the AI model.
/// This template guides the model in generating appropriate commit messages.
fn get_instruction_template() -> String {
fn get_instruction_template() -> Result<String> {
profile!("Generate instruction template");
INSTRUCTION_TEMPLATE.replace("{{max_commit_length}}", &config::APP.max_commit_length.unwrap_or(72).to_string())
let max_length = config::APP.max_commit_length.unwrap_or(72).to_string();
let template = mustache::compile_str(INSTRUCTION_TEMPLATE)
.map_err(|e| anyhow::anyhow!("Template compilation error: {}", e))?
.render_to_string(&hashmap! {
"max_length" => max_length,
"diff" => "".to_string(),
})
.map_err(|e| anyhow::anyhow!("Template rendering error: {}", e))?;
Ok(template)
}

/// Calculates the number of tokens used by the instruction template.
@@ -22,7 +31,8 @@ fn get_instruction_template() -> String {
/// * `Result<usize>` - The number of tokens used or an error
pub fn get_instruction_token_count(model: &Model) -> Result<usize> {
profile!("Calculate instruction tokens");
model.count_tokens(&get_instruction_template())
let template = get_instruction_template()?;
model.count_tokens(&template)
}

/// Creates an OpenAI request for commit message generation.
@@ -33,15 +43,24 @@ pub fn get_instruction_token_count(model: &Model) -> Result<usize> {
/// * `model` - The AI model to use for generation
///
/// # Returns
/// * `openai::Request` - The prepared request
fn create_commit_request(diff: String, max_tokens: usize, model: Model) -> openai::Request {
/// * `Result<openai::Request>` - The prepared request
pub fn create_commit_request(diff: String, max_tokens: usize, model: Model) -> Result<openai::Request> {
profile!("Prepare OpenAI request");
openai::Request {
system: get_instruction_template(),
prompt: diff,
let max_length = config::APP.max_commit_length.unwrap_or(72).to_string();
let instruction_template = mustache::compile_str(INSTRUCTION_TEMPLATE)
.map_err(|e| anyhow::anyhow!("Template compilation error: {}", e))?
.render_to_string(&hashmap! {
"max_length" => max_length,
"diff" => diff,
})
.map_err(|e| anyhow::anyhow!("Template rendering error: {}", e))?;

Ok(openai::Request {
system: instruction_template,
prompt: "".to_string(),
max_tokens: max_tokens.try_into().unwrap_or(u16::MAX),
model
}
})
}

/// Generates a commit message using the AI model.
@@ -65,7 +84,7 @@ pub async fn generate(patch: String, remaining_tokens: usize, model: Model) -> R
bail!("Maximum token count must be greater than zero")
}

let request = create_commit_request(patch, remaining_tokens, model);
let request = create_commit_request(patch, remaining_tokens, model)?;
openai::call(request).await
}

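Finally, a hedged sketch of driving the updated `generate` entry point end to end; the 4096-token budget and `Model::default()` are placeholders, not values taken from this PR:

```rust
use anyhow::Result;

use crate::commit;
use crate::model::Model;

async fn example(patch: String) -> Result<()> {
  // Assumes a Default impl on Model; use the crate's actual model selection.
  let model = Model::default();

  // Placeholder budget: subtract the instruction template's token count from
  // an assumed context window so the diff still fits (must stay above zero).
  let remaining_tokens = 4096 - commit::get_instruction_token_count(&model)?;

  // The `response` field name follows the openai::Response usage shown in src/client.rs.
  let response = commit::generate(patch, remaining_tokens, model).await?;
  println!("{}", response.response);
  Ok(())
}
```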