Add unified jobs API with /api/jobs endpoints #11054
Open
ric-yu wants to merge 21 commits into comfyanonymous:master from ric-yu:feature/unified-jobs-api
+865 −0
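For orientation, a client call against the new endpoint might look like the sketch below. This is a hedged illustration only: the query-parameter names are inferred from the `get_all_jobs()` helper in the diff, and the port and response envelope are assumptions not shown on this page.

```python
# Hypothetical client sketch of the unified jobs endpoint.
# Parameter names mirror get_all_jobs() in the diff below; the "jobs"
# envelope key and the default ComfyUI port are assumptions.
import json
from urllib.request import urlopen

url = ("http://127.0.0.1:8188/api/jobs"
       "?status=completed&sort_by=execution_duration&sort_order=desc&limit=20")
with urlopen(url) as resp:
    payload = json.load(resp)

for job in payload.get("jobs", []):  # envelope key "jobs" is an assumption
    print(job["id"], job["status"], job.get("outputs_count"))
```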
Commits (21)
All commits by ric-yu:

- 5aa07d7 feat: create a /jobs api to return queue and history jobs
- aa00641 update unused vars
- b2bd48e include priority
- b874f46 create jobs helper file
- 380b6ae fix ruff
- 048c413 update how we set error message
- 2e0b26b include execution error in both responses
- e4c7136 rename error -> failed, fix output shape
- 90fb5cc re-use queue and history functions
- 8a2bb7c set workflow id
- b7c7712 allow srot by exec duration
- a38aacf fix tests
- 1b67306 send priority and remove error msg
- 5274b91 use ws messages to get start and end times
- c860cc6 revert main.py fully
- 5ed90a1 refactor: move all /jobs business logic to jobs.py
- b14ae80 fix failing test
- a10cb30 remove some tests
- 86590ca fix non dict nodes
- 460b848 address comments
- 1f7c1a9 filter by workflow id and remove null fields
jobs.py (new file, +262 lines):

```python
"""
Job utilities for the /api/jobs endpoint.
Provides normalization and helper functions for job status tracking.
"""

from comfy_api.internal import prune_dict


class JobStatus:
    """Job status constants."""
    PENDING = 'pending'
    IN_PROGRESS = 'in_progress'
    COMPLETED = 'completed'
    FAILED = 'failed'

    ALL = [PENDING, IN_PROGRESS, COMPLETED, FAILED]


# Media types that can be previewed in the frontend
PREVIEWABLE_MEDIA_TYPES = frozenset({'images', 'video', 'audio'})

# 3D file extensions for preview fallback (no dedicated media_type exists)
THREE_D_EXTENSIONS = frozenset({'.obj', '.fbx', '.gltf', '.glb'})


def is_previewable(media_type, item):
    """
    Check if an output item is previewable.
    Matches frontend logic in ComfyUI_frontend/src/stores/queueStore.ts

    Priority:
    1. media_type is 'images', 'video', or 'audio'
    2. format field starts with 'video/' or 'audio/'
    3. filename has a 3D extension (.obj, .fbx, .gltf, .glb)
    """
    if media_type in PREVIEWABLE_MEDIA_TYPES:
        return True

    # Check format field (MIME type)
    fmt = item.get('format', '')
    if fmt and (fmt.startswith('video/') or fmt.startswith('audio/')):
        return True

    # Check for 3D files by extension
    filename = item.get('filename', '').lower()
    if any(filename.endswith(ext) for ext in THREE_D_EXTENSIONS):
        return True

    return False


def normalize_queue_item(item, status):
    """Convert queue item tuple to unified job dict."""
    priority, prompt_id, _, extra_data, _ = item[:5]
    create_time = extra_data.get('create_time')
    extra_pnginfo = extra_data.get('extra_pnginfo') or {}
    workflow_id = extra_pnginfo.get('workflow', {}).get('id')

    return prune_dict({
        'id': prompt_id,
        'status': status,
        'priority': priority,
        'create_time': create_time,
        'outputs_count': 0,
        'workflow_id': workflow_id,
    })


def normalize_history_item(prompt_id, history_item, include_outputs=False):
    """Convert history item dict to unified job dict."""
    prompt_tuple = history_item['prompt']
    priority, _, prompt, extra_data, _ = prompt_tuple[:5]
    create_time = extra_data.get('create_time')
    extra_pnginfo = extra_data.get('extra_pnginfo') or {}
    workflow_id = extra_pnginfo.get('workflow', {}).get('id')

    status_info = history_item.get('status', {})
    status_str = status_info.get('status_str') if status_info else None
    if status_str == 'success':
        status = JobStatus.COMPLETED
    elif status_str == 'error':
        status = JobStatus.FAILED
    else:
        status = JobStatus.COMPLETED

    outputs = history_item.get('outputs') or {}
    outputs_count, preview_output = get_outputs_summary(outputs)

    execution_error = None
    execution_start_time = None
    execution_end_time = None
    if status_info:
        messages = status_info.get('messages', [])
        for entry in messages:
            if isinstance(entry, (list, tuple)) and len(entry) >= 2:
                event_name, event_data = entry[0], entry[1]
                if isinstance(event_data, dict):
                    if event_name == 'execution_start':
                        execution_start_time = event_data.get('timestamp')
                    elif event_name in ('execution_success', 'execution_error', 'execution_interrupted'):
                        execution_end_time = event_data.get('timestamp')
                        if event_name == 'execution_error':
                            execution_error = event_data

    job = prune_dict({
        'id': prompt_id,
        'status': status,
        'priority': priority,
        'create_time': create_time,
        'execution_start_time': execution_start_time,
        'execution_end_time': execution_end_time,
        'execution_error': execution_error,
        'outputs_count': outputs_count,
        'preview_output': preview_output,
        'workflow_id': workflow_id,
    })

    if include_outputs:
        job['outputs'] = outputs
        job['execution_status'] = status_info
        job['workflow'] = {
            'prompt': prompt,
            'extra_data': extra_data,
        }

    return job


def get_outputs_summary(outputs):
    """
    Count outputs and find preview in a single pass.
    Returns (outputs_count, preview_output).

    Preview priority (matching frontend):
    1. type="output" with previewable media
    2. Any previewable media
    """
    count = 0
    preview_output = None
    fallback_preview = None

    for node_id, node_outputs in outputs.items():
        if not isinstance(node_outputs, dict):
            continue
        for media_type, items in node_outputs.items():
            if media_type == 'animated' or not isinstance(items, list):
                continue

            for item in items:
                if not isinstance(item, dict):
                    continue
                count += 1

                if preview_output is None and is_previewable(media_type, item):
                    enriched = {
                        **item,
                        'nodeId': node_id,
                        'mediaType': media_type
                    }
                    if item.get('type') == 'output':
                        preview_output = enriched
                    elif fallback_preview is None:
                        fallback_preview = enriched

    return count, preview_output or fallback_preview


def apply_sorting(jobs, sort_by, sort_order):
    """Sort jobs list by specified field and order."""
    reverse = (sort_order == 'desc')

    if sort_by == 'execution_duration':
        def get_sort_key(job):
            start = job.get('execution_start_time') or 0
            end = job.get('execution_end_time') or 0
            return end - start if end and start else 0
    else:
        def get_sort_key(job):
            return job.get('create_time') or 0

    return sorted(jobs, key=get_sort_key, reverse=reverse)


def get_job(prompt_id, running, queued, history):
    """
    Get a single job by prompt_id from history or queue.

    Args:
        prompt_id: The prompt ID to look up
        running: List of currently running queue items
        queued: List of pending queue items
        history: Dict of history items keyed by prompt_id

    Returns:
        Job dict with full details, or None if not found
    """
    if prompt_id in history:
        return normalize_history_item(prompt_id, history[prompt_id], include_outputs=True)

    for item in running:
        if item[1] == prompt_id:
            return normalize_queue_item(item, JobStatus.IN_PROGRESS)

    for item in queued:
        if item[1] == prompt_id:
            return normalize_queue_item(item, JobStatus.PENDING)

    return None


def get_all_jobs(running, queued, history, status_filter=None, workflow_id=None, sort_by="created_at", sort_order="desc", limit=None, offset=0):
    """
    Get all jobs (running, pending, completed) with filtering and sorting.

    Args:
        running: List of currently running queue items
        queued: List of pending queue items
        history: Dict of history items keyed by prompt_id
        status_filter: List of statuses to include (from JobStatus.ALL)
        workflow_id: Filter by workflow ID
        sort_by: Field to sort by ('created_at', 'execution_duration')
        sort_order: 'asc' or 'desc'
        limit: Maximum number of items to return
        offset: Number of items to skip

    Returns:
        tuple: (jobs_list, total_count)
    """
    jobs = []

    if status_filter is None:
        status_filter = JobStatus.ALL

    if JobStatus.IN_PROGRESS in status_filter:
        for item in running:
            jobs.append(normalize_queue_item(item, JobStatus.IN_PROGRESS))

    if JobStatus.PENDING in status_filter:
        for item in queued:
            jobs.append(normalize_queue_item(item, JobStatus.PENDING))

    include_completed = JobStatus.COMPLETED in status_filter
    include_failed = JobStatus.FAILED in status_filter
    if include_completed or include_failed:
        for prompt_id, history_item in history.items():
            is_failed = history_item.get('status', {}).get('status_str') == 'error'
            if (is_failed and include_failed) or (not is_failed and include_completed):
                jobs.append(normalize_history_item(prompt_id, history_item))

    if workflow_id:
        jobs = [j for j in jobs if j.get('workflow_id') == workflow_id]

    jobs = apply_sorting(jobs, sort_by, sort_order)

    total_count = len(jobs)

    if offset > 0:
        jobs = jobs[offset:]
    if limit is not None:
        jobs = jobs[:limit]

    return (jobs, total_count)
```
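A minimal usage sketch of the helpers above. The import line is an assumption: commit 5ed90a1 suggests the file is jobs.py, but its package path is not shown on this page. The sample tuples mirror the `(priority, prompt_id, prompt, extra_data, outputs)` shape that `normalize_queue_item()` unpacks, and the history entries carry the same prompt tuple plus `status` and `outputs`.

```python
# Minimal usage sketch, assuming the module is importable as `jobs`
# (the actual package path is not shown in this diff).
import jobs as jobs_mod

# Queue items: (priority, prompt_id, prompt, extra_data, outputs) tuples.
running = [(0, "run-1", {}, {"create_time": 1700000200}, [])]
queued = [(1, "pend-1", {}, {"create_time": 1700000300}, [])]

# History items: same prompt tuple, keyed by prompt_id, plus status/outputs.
history = {
    "done-1": {
        "prompt": (2, "done-1", {}, {"create_time": 1700000000}, []),
        "status": {
            "status_str": "success",
            "messages": [
                ("execution_start", {"timestamp": 1700000010}),
                ("execution_success", {"timestamp": 1700000042}),
            ],
        },
        "outputs": {"9": {"images": [{"filename": "out.png", "type": "output"}]}},
    }
}

# All statuses, newest create_time first (the defaults).
job_list, total = jobs_mod.get_all_jobs(running, queued, history)
print(total)                  # 3
print(job_list[0]["status"])  # 'pending' (pend-1 has the newest create_time)

# Completed jobs only, with pagination.
done, _ = jobs_mod.get_all_jobs(running, queued, history,
                                status_filter=[jobs_mod.JobStatus.COMPLETED],
                                limit=10, offset=0)
print(len(done))              # 1

# Single lookup returns full outputs plus the derived execution times.
job = jobs_mod.get_job("done-1", running, queued, history)
print(job["outputs_count"])                                      # 1
print(job["preview_output"]["mediaType"])                        # 'images'
print(job["execution_end_time"] - job["execution_start_time"])   # 32
```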
Review comment: similar comment to before -- likely out of scope for this PR, but can we start introducing some structure to this repo and begin breaking this superclass up into separate files, for maintainability and extensibility?