Update CloneAllOrganizationRepositoriesByHTTPS script to support pagination for organizations with more than 100 repositories #30
## Summary

This PR addresses issue #8 by updating the `CloneAllOrganizationRepositoriesByHTTPS.sh` script to handle organizations with more than 100 repositories through proper API pagination.

## Changes Made

- Added a `fetch_all_repositories()` function that automatically handles GitHub API pagination
- Uses `per_page=100` instead of `per_page=200` and fetches all pages as needed

## Technical Details
The original script used a single API call with `per_page=200`. The GitHub API caps `per_page` at 100 and paginates larger result sets, so any repositories beyond the first 100 were silently missed.

The new implementation:

- Requests pages with `per_page=100` and keeps fetching until all repositories have been retrieved

## Testing
The script can now handle organizations with any number of repositories, resolving the issue before LinksPlatform grows beyond 100 repositories.
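The pagination loop described above can be sketched as follows. This is a minimal illustration, not the actual patch: `fetch_page` is a hypothetical stand-in that simulates the GitHub API's 100-per-page responses for an org with 250 repositories, so the loop's behavior can be checked without network access.

```shell
#!/bin/sh
# Stand-in for the real API call (assumption, not the PR's code):
# simulates an org with 250 repos, served 100 per page as GitHub does.
fetch_page() {
  page=$1
  start=$(( (page - 1) * 100 + 1 ))
  end=$(( page * 100 ))
  [ "$end" -gt 250 ] && end=250
  [ "$start" -gt 250 ] && return 0   # past the last repo: empty page
  seq "$start" "$end" | sed 's/^/repo-/'
}

# Pagination pattern as described in the PR: keep requesting pages
# until one comes back empty, then stop.
fetch_all_repositories() {
  page=1
  while :; do
    repos=$(fetch_page "$page")
    [ -z "$repos" ] && break
    printf '%s\n' "$repos"
    page=$((page + 1))
  done
}

count=$(fetch_all_repositories | wc -l | tr -d ' ')
echo "fetched $count repositories"   # prints: fetched 250 repositories
```

In the real script, `fetch_page` would be a `curl` call against `https://api.github.com/orgs/<org>/repos?per_page=100&page=<n>`; the single-request version stops at 100 results no matter what `per_page` is asked for.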
Fixes #8
🤖 Generated with Claude Code