Conversation
Pull request overview
This PR introduces a caching mechanism for GitHub workflow file lookups to reduce redundant API calls when triggering workflows. The implementation adds a module-level cache dictionary and a new get_workflow() method that checks the cache before fetching from the GitHub API.
Key changes:
- Introduces `_WORKFLOW_FILE_CACHE`, a module-level dictionary that stores Workflow objects
- Adds an async `get_workflow()` method to retrieve cached workflows or fetch them from GitHub
- Updates `trigger()` to use the new caching mechanism instead of direct API calls
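For concreteness, here is a minimal sketch of how such a caching path might fit together, assuming `self.repo` holds a PyGithub Repository; the class name, constructor, and `trigger()` body are illustrative and not taken from the PR:

```python
import logging

logger = logging.getLogger(__name__)

# Module-level cache keyed by workflow file name.
_WORKFLOW_FILE_CACHE: dict = {}


class WorkflowRunner:  # hypothetical name; the real class is not shown in this excerpt
    def __init__(self, repo, workflow_file: str):
        self.repo = repo  # assumed to be a PyGithub Repository
        self.workflow_file = workflow_file

    async def get_workflow(self):
        # Return the cached Workflow object if an earlier call already fetched it.
        if self.workflow_file in _WORKFLOW_FILE_CACHE:
            logger.info(f"Returning cached workflow {self.workflow_file}")
            return _WORKFLOW_FILE_CACHE[self.workflow_file]
        logger.info(f"Fetching workflow {self.workflow_file} from GitHub")
        # Blocking PyGithub call; the review comment below is about this line.
        workflow = self.repo.get_workflow(self.workflow_file)
        _WORKFLOW_FILE_CACHE[self.workflow_file] = workflow
        return workflow

    async def trigger(self, ref: str, inputs: dict | None = None):
        # trigger() now resolves the workflow through the cache instead of
        # calling the API directly on every invocation.
        workflow = await self.get_workflow()
        workflow.create_dispatch(ref, inputs or {})
```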
| logger.info(f"Returning cached workflow {self.workflow_file}") | ||
| return _WORKFLOW_FILE_CACHE[self.workflow_file] | ||
| logger.info(f"Fetching workflow {self.workflow_file} from GitHub") | ||
| workflow = self.repo.get_workflow(self.workflow_file) |
The `get_workflow` method is declared as `async` but doesn't perform any `await` operations. Line 222 calls `self.repo.get_workflow` synchronously, which is a blocking network I/O operation that was previously wrapped with `asyncio.to_thread` at line 246. Removing the `asyncio.to_thread` wrapper without making the method properly async means this now blocks the event loop. Either add back the `asyncio.to_thread` wrapper, or change this to a synchronous method if blocking is acceptable.
```diff
- workflow = self.repo.get_workflow(self.workflow_file)
+ workflow = await asyncio.to_thread(self.repo.get_workflow, self.workflow_file)
```
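To illustrate the reviewer's point, here is a small self-contained example of the `asyncio.to_thread` pattern; the `blocking_fetch` helper and the `deploy.yml` name are made up, standing in for `self.repo.get_workflow`. Wrapping the blocking call keeps the event loop free to run other coroutines while the network I/O happens in a worker thread:

```python
import asyncio
import time


def blocking_fetch(name: str) -> str:
    # Stand-in for self.repo.get_workflow(), which performs blocking network I/O.
    time.sleep(1)
    return f"workflow:{name}"


async def heartbeat() -> None:
    # Keeps ticking while the fetch runs, showing the loop is not blocked.
    for _ in range(4):
        print("event loop alive")
        await asyncio.sleep(0.25)


async def main() -> None:
    workflow, _ = await asyncio.gather(
        asyncio.to_thread(blocking_fetch, "deploy.yml"),
        heartbeat(),
    )
    print(workflow)


asyncio.run(main())
```

The alternative the review mentions, making `get_workflow` a plain synchronous method, avoids the thread hop but means callers such as `trigger()` accept that the event loop blocks for the duration of the API call.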