Conversation

googs1025 (Collaborator)

Pull Request Description

[Please provide a clear and concise description of your changes here]

Related Issues

fix: #1591

Important: Before submitting, please complete the description above and review the checklist below.


Contribution Guidelines

We appreciate your contribution to aibrix! To ensure a smooth review process and maintain high code quality, please adhere to the following guidelines:

Pull Request Title Format

Your PR title should start with one of these prefixes to indicate the nature of the change:

  • [Bug]: Corrections to existing functionality
  • [CI]: Changes to build process or CI pipeline
  • [Docs]: Updates or additions to documentation
  • [API]: Modifications to aibrix's API or interface
  • [CLI]: Changes or additions to the Command Line Interface
  • [Misc]: For changes not covered above (use sparingly)

Note: For changes spanning multiple categories, use multiple prefixes in order of importance.

Submission Checklist

  • PR title includes appropriate prefix(es)
  • Changes are clearly explained in the PR description
  • New and existing tests pass successfully
  • Code adheres to project style and best practices
  • Documentation updated to reflect changes (if applicable)
  • Thorough testing completed, no regressions introduced

By submitting this PR, you confirm that you've read these guidelines and your changes align with the project's contribution standards.

gemini-code-assist bot (Contributor)

Summary of Changes

Hello @googs1025, I'm Gemini Code Assist[1]! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request significantly enhances the PodAutoscaler's capabilities by introducing support for multiple metrics. This allows for more sophisticated and robust autoscaling decisions, as the system can now consider various performance indicators simultaneously. The core scaling algorithms and internal context management have been adapted to accommodate this change, ensuring that the autoscaler can effectively respond to diverse workload demands by scaling based on the most critical metric at any given time.

Highlights

  • Multi-Metric Support: The PodAutoscaler now supports defining multiple metrics for scaling. The system will calculate desired replicas for each configured metric and then apply the maximum of these recommendations.
  • Algorithm Updates: Both the APA (Adaptive Pod Autoscaler) and KPA (Knative Pod Autoscaler) algorithms have been updated to accept a specific metric name when computing target replicas, allowing them to retrieve the correct target value from the scaling context.
  • Scaling Context Refactoring: The internal ScalingContext has been refactored to store metric-specific target values in a map (MetricTargets) keyed by metric name, replacing the previous single TargetValue and ScalingMetric fields.
  • Validation Logic Enhanced: The validation logic for PodAutoscaler resources has been updated to correctly handle and validate multiple metricsSources defined in the spec.
  • New Sample Configuration: A new sample YAML file (multimetrics-apa.yaml) has been added to demonstrate how to configure a PodAutoscaler to use multiple metrics for scaling.
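
The new sample's contents are not reproduced in this thread. As a rough illustration only, a multi-metric PodAutoscaler spec could take a shape like the following; every field name, metric name, and value below is a guess for demonstration purposes, not the actual contents of multimetrics-apa.yaml:

```yaml
# Hypothetical sketch of a multi-metric PodAutoscaler.
# Field names, metric names, and values are illustrative assumptions.
apiVersion: autoscaling.aibrix.ai/v1alpha1
kind: PodAutoscaler
metadata:
  name: multimetrics-apa-example
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: example-model-deployment
  minReplicas: 1
  maxReplicas: 10
  scalingStrategy: APA
  metricsSources:
    - targetMetric: gpu_cache_usage_perc
      targetValue: "0.5"
    - targetMetric: num_requests_running
      targetValue: "10"
```

With two sources configured as above, the autoscaler would compute a desired replica count for each metric and apply the larger of the two recommendations.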
Using Gemini Code Assist

The full guide for Gemini Code Assist can be found on our documentation page; here are some quick tips.

Invoking Gemini

You can request assistance from Gemini at any point by creating a comment using either /gemini <command> or @gemini-code-assist <command>. Below is a summary of the supported commands on the current page.

  • Code Review (/gemini review): Performs a code review for the current pull request in its current state.
  • Pull Request Summary (/gemini summary): Provides a summary of the current pull request in its current state.
  • Comment (@gemini-code-assist): Responds in comments when explicitly tagged, both in pull request comments and review comments.
  • Help (/gemini help): Displays a list of available commands.

Customization

To customize the Gemini Code Assist for GitHub experience, repository maintainers can create a configuration file and/or provide a custom code review style guide (such as PEP-8 for Python) by adding files to a .gemini/ folder in the base of the repository. Detailed instructions can be found here.

Limitations & Feedback

Gemini Code Assist may make mistakes. Please leave feedback on any instances where its feedback is incorrect or counterproductive. You can react with 👍 and 👎 on @gemini-code-assist comments. If you're interested in giving feedback about your experience with Gemini Code Assist for GitHub and other Google products, sign up here.

You can also get AI-powered code generation, chat, as well as code reviews directly in the IDE at no cost with the Gemini Code Assist IDE Extension.

Footnotes

  1. Review the Privacy Notices, Generative AI Prohibited Use Policy, Terms of Service, and learn how to configure Gemini Code Assist in GitHub here. Gemini can make mistakes, so double check it and use code with caution.

@gemini-code-assist bot left a comment

Code Review

This pull request introduces support for multiple metrics in the PodAutoscaler, a significant enhancement. The core logic now iterates through all specified metric sources, computes a scaling recommendation for each, and selects the maximum desired replica count. This ensures the system scales up to meet the demand of the most constrained resource. The changes are well structured, touching the autoscaler, context, and validation logic, and are accompanied by updated tests and a new sample YAML for multi-metric configuration.

My review identified one bug in the implementation: the scaling reason and algorithm were not correctly propagated when using multiple metrics. I've provided a code suggestion to address this. Overall, this is a great improvement to the autoscaler's flexibility.
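
The flow the review describes (compute a recommendation per metric, then take the maximum) can be sketched roughly as follows. All identifiers here (ScalingContext, MetricTargets, computeDesiredReplicas, maxRecommendation) and the simple ratio-based rule are illustrative assumptions, not aibrix's actual implementation:

```go
package main

import "fmt"

// ScalingContext is a minimal stand-in for the refactored context:
// per-metric target values live in a map keyed by metric name,
// replacing the previous single TargetValue/ScalingMetric fields.
type ScalingContext struct {
	MetricTargets map[string]float64 // metric name -> target value
}

// computeDesiredReplicas applies a basic ratio rule
// (currentValue / targetValue * currentReplicas) for one metric.
// A missing or non-positive target leaves the replica count unchanged.
func computeDesiredReplicas(ctx *ScalingContext, metric string, currentValue float64, currentReplicas int32) int32 {
	target := ctx.MetricTargets[metric]
	if target <= 0 {
		return currentReplicas
	}
	desired := int32(float64(currentReplicas) * currentValue / target)
	if desired < 1 {
		desired = 1
	}
	return desired
}

// maxRecommendation picks the largest desired-replica count across all
// observed metrics, so the most constrained resource drives scale-up.
func maxRecommendation(ctx *ScalingContext, observed map[string]float64, currentReplicas int32) int32 {
	best := int32(0)
	for name, value := range observed {
		if d := computeDesiredReplicas(ctx, name, value, currentReplicas); d > best {
			best = d
		}
	}
	return best
}

func main() {
	ctx := &ScalingContext{MetricTargets: map[string]float64{
		"gpu_cache_usage_perc": 0.5,
		"num_requests_running": 10,
	}}
	observed := map[string]float64{
		"gpu_cache_usage_perc": 0.8, // 4 * 0.8/0.5 = 6.4 -> 6
		"num_requests_running": 30,  // 4 * 30/10  = 12
	}
	fmt.Println(maxRecommendation(ctx, observed, 4)) // prints 12
}
```

The key point is that each metric produces an independent recommendation against its own target, and only the maximum is applied, so no metric's demand is left unmet.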

@googs1025 googs1025 force-pushed the multi_metrics_for_podautoscaler branch 2 times, most recently from 92b16d1 to e6d6c8d Compare October 20, 2025 11:37
@googs1025 googs1025 marked this pull request as draft October 20, 2025 12:31
// NewNamespaceNameMetric creates a MetricKey based on the PodAutoscaler's metrics source
// and, for consistency, also returns the corresponding MetricSource.
// It currently supports only a single metric source; in the future this could be
// extended to handle multiple metric sources.
func NewNamespaceNameMetric(pa *autoscalingv1alpha1.PodAutoscaler) (types.MetricKey, autoscalingv1alpha1.MetricSource, error) {
googs1025 (Collaborator, Author):
remove this func

@googs1025 googs1025 requested a review from Jeffwan October 21, 2025 01:51
@googs1025 (Collaborator, Author):

note: Do we need to limit the number of metricsSources slices at the api level?

@googs1025 googs1025 marked this pull request as ready for review October 21, 2025 01:52
Jeffwan (Collaborator) commented Oct 21, 2025

> Do we need to limit the number of metricsSources slices at the api level?

I think we do not need to limit it at this moment.
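
Following that decision, validation would check that each entry in the list is well-formed rather than capping the list length. A minimal sketch of what such a check could look like; MetricSource and its fields here are simplified stand-ins, not aibrix's actual API types:

```go
package main

import (
	"fmt"
	"strconv"
)

// MetricSource is a simplified, hypothetical stand-in for one entry of
// the PodAutoscaler's metricsSources list.
type MetricSource struct {
	MetricName  string
	TargetValue string // CRDs often carry numeric targets as strings
}

// validateMetricSources accepts any number of sources (no length cap,
// per the discussion above), but requires each entry to have a
// non-empty, unique metric name and a numeric target value.
func validateMetricSources(sources []MetricSource) error {
	if len(sources) == 0 {
		return fmt.Errorf("at least one metric source is required")
	}
	seen := map[string]bool{}
	for i, s := range sources {
		if s.MetricName == "" {
			return fmt.Errorf("metricsSources[%d]: metric name must not be empty", i)
		}
		if seen[s.MetricName] {
			return fmt.Errorf("metricsSources[%d]: duplicate metric %q", i, s.MetricName)
		}
		seen[s.MetricName] = true
		if _, err := strconv.ParseFloat(s.TargetValue, 64); err != nil {
			return fmt.Errorf("metricsSources[%d]: target value %q is not numeric", i, s.TargetValue)
		}
	}
	return nil
}

func main() {
	ok := []MetricSource{{"gpu_cache_usage_perc", "0.5"}, {"num_requests_running", "10"}}
	fmt.Println(validateMetricSources(ok)) // prints <nil>

	dup := []MetricSource{{"gpu_cache_usage_perc", "0.5"}, {"gpu_cache_usage_perc", "0.7"}}
	fmt.Println(validateMetricSources(dup) != nil) // prints true
}
```

Rejecting duplicate metric names is the one structural rule that still matters without a length cap, since two entries targeting the same metric would make the max-of-recommendations result ambiguous.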

Signed-off-by: googs1025 <googs1025@gmail.com>
@googs1025 googs1025 force-pushed the multi_metrics_for_podautoscaler branch from e6d6c8d to 9e0bb53 Compare October 21, 2025 05:35

Development

Successfully merging this pull request may close these issues.

Support multi-metrics in podautoscaler