> **Note:** This repository was archived by the owner on Sep 12, 2024. It is now read-only.


# Contributing to AutoLLM 🌟

Thank you for considering a contribution to AutoLLM. Your input is invaluable to our project's continued growth and improvement.

## PR Guidelines 📝

To streamline the integration of your contributions:

- **Start by Forking 🍴**: Work on your own copy of the project. See GitHub's guide to creating a pull request from a fork.

- **New Branch 🌱**: Always create a new branch for your PR. It keeps things neat and makes the review process smoother.

- **Size Matters 📏**: Aim for smaller PRs. If you have a big feature in mind, consider breaking it up; smaller PRs are easier to review and get you feedback quicker!

- **Stay Current 🕰️**: Keep your PR synchronized with the latest changes on the `safevideo/autollm` main branch. If your branch is outdated, update it using the 'Update branch' button on GitHub, or merge the latest upstream main into your branch locally.
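Taken together, a typical fork-and-sync workflow looks like the following sketch (the fork URL and branch name `my-feature` are placeholders for your own):

```shell
# Fork safevideo/autollm on GitHub first, then clone your fork
git clone https://github.com/<your-username>/autollm.git
cd autollm

# Track the original repository as "upstream"
git remote add upstream https://github.com/safevideo/autollm.git

# Do your work on a dedicated branch
git checkout -b my-feature
# ...commit your changes...

# Bring your branch up to date with upstream main before opening the PR
git fetch upstream
git merge upstream/main

# Push and open the PR from your fork
git push origin my-feature
```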

## Code Standards 🛠️

Maintaining a consistent codebase is crucial. We use tools such as `flake8` and `isort` to enforce this.

### Pre-commit Hooks 🔗

1. **Installation**:

   ```shell
   pip install "autollm[dev]"
   ```

2. **Pre-commit Setup**:

   ```shell
   pre-commit install
   pre-commit run --all-files
   ```

Once set up, the pre-commit hooks automatically check and format your code on every commit.

### Docstrings 📜

For functions or classes that warrant explanation, we use docstrings adhering to the Google style (which uses the `Args:` section heading):

```python
"""
Brief description of the function's purpose.

Args:
    arg1: Description of the first argument.
    arg2: Description of the second argument.

Returns:
    Expected return values or outcomes.

Raises:
    Potential exceptions and reasons for them.
"""
```

## Testing 🔍

Before finalizing your PR, ensure it passes the existing test suite:

```shell
pytest
```
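If your change adds behavior, include a test for it. pytest discovers files named `test_*.py` and runs functions named `test_*`; a minimal, self-contained example (the function and file name are hypothetical, for illustration only) looks like:

```python
# test_example.py -- a minimal pytest module

def reverse_words(sentence: str) -> str:
    """Return the sentence with its word order reversed."""
    return " ".join(reversed(sentence.split()))


def test_reverse_words():
    # pytest runs any function whose name starts with "test_"
    assert reverse_words("hello world") == "world hello"
```

Running `pytest test_example.py` will collect and execute `test_reverse_words` automatically.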

Your interest and potential contributions to AutoLLM are greatly appreciated. Together, we can continue refining and expanding AutoLLM for the broader community.