
Conversation

@EngineerNV (Owner)

Summary

  • replace the vulnerable standalone keras dependency with TensorFlow's built-in keras implementation and require a modern tensorflow release
  • update project modules to import layers, models, callbacks, and utilities from tensorflow.keras instead of keras
  • modernize the YOLO loss implementation to use TensorFlow 2.x friendly operations and maintain training state safely
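The import migration in the second bullet can be sketched as follows; the specific layer and callback names are illustrative, not necessarily the exact ones the project's modules use:

```python
# Before (standalone keras package, now removed from the dependencies):
# from keras.models import Model
# from keras.layers import Conv2D, Input
# from keras.callbacks import EarlyStopping

# After (TensorFlow's bundled Keras implementation):
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Conv2D, Input
from tensorflow.keras.callbacks import EarlyStopping
```

Because `tensorflow.keras` mirrors the standalone package's module layout, the change is usually a mechanical prefix swap rather than a rewrite.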

Testing

  • python -m compileall backend.py frontend.py preprocessing.py experimental/yolo3_detect.py

https://chatgpt.com/codex/tasks/task_e_68ce6ebbf21083219a0854616942084d

@EngineerNV EngineerNV merged commit 2d82799 into master Sep 20, 2025
1 check passed

@chatgpt-codex-connector chatgpt-codex-connector bot left a comment


Codex Review: Here are some suggestions.


############################################

- optimizer = Adam(lr=learning_rate, beta_1=0.9, beta_2=0.999, epsilon=1e-08, decay=0.0)
+ optimizer = Adam(learning_rate=learning_rate, beta_1=0.9, beta_2=0.999, epsilon=1e-08, decay=0.0)


[P0] Remove unsupported decay argument from tf.keras Adam

The optimizer initialization still passes decay=0.0, which was valid for keras.optimizers.Adam but is no longer accepted by tf.keras.optimizers.Adam in the TensorFlow ≥2.15 dependency introduced in this commit. Instantiating the optimizer will raise TypeError: __init__() got an unexpected keyword argument 'decay', preventing the model from compiling or training. Consider dropping the argument or switching to tf.keras.optimizers.legacy.Adam if the legacy API is required.
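A minimal sketch of the fix, assuming the non-legacy optimizer is acceptable; the learning-rate value here is illustrative (the project presumably supplies its own `learning_rate` variable):

```python
import tensorflow as tf

# decay=0.0 had no effect under the old API, so simply dropping the
# argument is behavior-preserving under tf.keras.optimizers.Adam.
optimizer = tf.keras.optimizers.Adam(
    learning_rate=1e-3,  # illustrative; replace with the project's configured rate
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-08,
)

# If a nonzero time-based decay were ever needed, pass a schedule instead:
# lr_schedule = tf.keras.optimizers.schedules.InverseTimeDecay(
#     initial_learning_rate=1e-3, decay_steps=1, decay_rate=0.01)
# optimizer = tf.keras.optimizers.Adam(learning_rate=lr_schedule)
```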


@EngineerNV EngineerNV deleted the codex/remove-vulnerable-keras-dependency branch September 20, 2025 09:22