
Conversation

james77777778
Contributor

Fixes #21695

Also fixes the bfloat16 dtype handling of glu in the NumPy backend.
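The dtype fix described above can be illustrated with a minimal sketch. The function and names below are illustrative, not Keras's actual implementation, and float16 stands in for bfloat16 since plain NumPy has no native bfloat16 dtype. The key pattern is computing in float32 and casting back so low-precision inputs keep their dtype:

```python
import numpy as np

def glu(x, axis=-1):
    # Sketch only: run intermediate math in float32, then cast the
    # result back to the original dtype so precision metadata survives.
    original_dtype = x.dtype
    x = x.astype("float32")
    a, b = np.split(x, 2, axis=axis)  # split into value and gate halves
    out = a * (1.0 / (1.0 + np.exp(-b)))  # a * sigmoid(b)
    return out.astype(original_dtype)

x = np.ones((2, 4), dtype="float16")
y = glu(x)
print(y.shape, y.dtype)  # (2, 2) float16
```

Note that the output shape is (2, 2), not (2, 4): GLU halves the dimension along the split axis, which is the other bug this PR fixes.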


Summary of Changes

Hello @james77777778, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request addresses two key issues related to the glu (Gated Linear Unit) operation: it ensures that the KerasTensor output shape is correctly inferred by halving the relevant dimension, and it resolves a data type inconsistency that occurred when using bfloat16 with the NumPy backend, ensuring precision is maintained.

Highlights

  • Corrected KerasTensor Output Shape for GLU: The compute_output_spec method for the GLU operation has been updated to accurately reflect the output shape, which involves halving the dimension along the specified axis.
  • Resolved bfloat16 Dtype Issue in NumPy Backend: The glu function in the NumPy backend now explicitly casts its output to the original input dtype, fixing an issue where bfloat16 precision was not maintained during intermediate calculations.
  • Updated and Enabled GLU Tests: Tests for the glu operation have been modified to use input shapes that are divisible by two and to assert the correct halved output shapes. Additionally, the skip for bfloat16 dtype tests in glu has been removed, confirming the fix.
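The shape-inference fix in the first highlight can be sketched as a small helper. This is a hypothetical stand-in for the logic inside compute_output_spec, not the actual Keras code: the size along the split axis is halved, because the input is divided into two equal gates, and unknown (None) dimensions are passed through.

```python
def glu_output_shape(input_shape, axis=-1):
    # Illustrative sketch: halve the dimension along `axis`.
    output_shape = list(input_shape)
    if output_shape[axis] is not None:
        if output_shape[axis] % 2 != 0:
            raise ValueError(
                "The dimension along `axis` must be divisible by 2, "
                f"got {output_shape[axis]}"
            )
        output_shape[axis] = output_shape[axis] // 2
    return tuple(output_shape)

print(glu_output_shape((2, 8)))              # (2, 4)
print(glu_output_shape((None, 3, 6)))        # (None, 3, 3)
```

The divisibility check mirrors why the tests were updated to use input shapes divisible by two.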


@gemini-code-assist (bot) left a comment


Code Review

This pull request correctly fixes the output shape computation for the glu operation when used with a KerasTensor, ensuring the dimension along the specified axis is halved. It also resolves a bfloat16 dtype issue within the NumPy backend for glu. The accompanying test modifications accurately reflect these fixes. I have one suggestion to improve the implementation in the NumPy backend for better code reuse and maintainability.
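The reviewer's code-reuse suggestion can be sketched as follows. The sigmoid helper here is an assumption standing in for whatever activation helper the backend module already defines; the point is to call it rather than inline the formula a second time:

```python
import numpy as np

def sigmoid(x):
    # Assumed pre-existing backend helper (illustrative).
    return 1.0 / (1.0 + np.exp(-x))

def glu(x, axis=-1):
    # Reusing the shared sigmoid avoids duplicating the formula,
    # so a future numerical fix lands in one place.
    a, b = np.split(x, 2, axis=axis)
    return a * sigmoid(b)

print(glu(np.zeros((2, 4))).shape)  # (2, 2)
```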

@codecov-commenter

codecov-commenter commented Sep 25, 2025

Codecov Report

❌ Patch coverage is 62.50000% with 3 lines in your changes missing coverage. Please review.
✅ Project coverage is 82.57%. Comparing base (b12b2af) to head (8da76fa).
⚠️ Report is 1 commit behind head on master.

Files with missing lines Patch % Lines
keras/src/ops/nn.py 50.00% 1 Missing and 2 partials ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##           master   #21696      +/-   ##
==========================================
+ Coverage   76.83%   82.57%   +5.73%     
==========================================
  Files         572      572              
  Lines       58159    58165       +6     
  Branches     9104     9106       +2     
==========================================
+ Hits        44689    48031    +3342     
+ Misses      11211     7808    -3403     
- Partials     2259     2326      +67     
Flag Coverage Δ
keras 82.37% <62.50%> (+5.63%) ⬆️
keras-jax 63.38% <37.50%> (-0.01%) ⬇️
keras-numpy 57.71% <62.50%> (-0.01%) ⬇️
keras-openvino 34.27% <0.00%> (-0.01%) ⬇️
keras-tensorflow 64.12% <37.50%> (-0.01%) ⬇️
keras-torch 63.71% <37.50%> (?)

Flags with carried forward coverage won't be shown.

☔ View full report in Codecov by Sentry.


@fchollet left a comment


Thanks for the fix!

@google-ml-butler (bot) added the kokoro:force-run and ready to pull (Ready to be merged into the codebase) labels Sep 29, 2025
@fchollet merged commit b062368 into keras-team:master on Sep 29, 2025
8 checks passed
@james77777778 deleted the fix-glu branch on September 30, 2025 08:53

Labels

kokoro:force-run, ready to pull (Ready to be merged into the codebase), size:S


Development

Successfully merging this pull request may close these issues.

GLU shape computed incorrectly

4 participants