components batch_score_llm
github-actions[bot] edited this page Apr 30, 2024 · 13 revisions
Version: 1.1.7
View in Studio: https://ml.azure.com/registries/azureml/components/batch_score_llm/version/1.1.7
Predefined arguments for parallel job:

Name | Description | Type | Default | Optional | Enum
---|---|---|---|---|---
resume_from | The pipeline run id to resume from | string | | True |
PRS preview feature:

Name | Description | Type | Default | Optional | Enum
---|---|---|---|---|---
async_mode | Whether to use the PRS mini-batch streaming feature, which allows each PRS processor to process multiple mini-batches at a time. | boolean | False | True |
Custom arguments:

Name | Description | Type | Default | Optional | Enum
---|---|---|---|---|---
configuration_file | Configures the behavior of batch scoring. | uri_file | | False |
data_input_table | The data to be split and scored in parallel. | mltable | | False |
Outputs:

Name | Description | Type
---|---|---
job_output_path | | uri_file
mini_batch_results_output_directory | | uri_folder
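As an illustration, the inputs and outputs above can be wired up in an Azure ML pipeline job YAML that pulls this component version from the `azureml` registry. This is a sketch: the compute name and the two input paths are placeholders, not values from this page, and the configuration file's contents are defined by the component's own documentation.

```yaml
# Sketch of a pipeline job running batch_score_llm 1.1.7 from the azureml registry.
# <my-cluster>, <config.json>, and <data-folder> are illustrative placeholders.
$schema: https://azuremlschemas.azureedge.net/latest/pipelineJob.schema.json
type: pipeline
jobs:
  batch_score:
    component: azureml://registries/azureml/components/batch_score_llm/versions/1.1.7
    compute: azureml:<my-cluster>
    inputs:
      configuration_file:
        type: uri_file
        path: azureml://datastores/workspaceblobstore/paths/<config.json>
      data_input_table:
        type: mltable
        path: azureml://datastores/workspaceblobstore/paths/<data-folder>
    outputs:
      job_output_path:
        type: uri_file
      mini_batch_results_output_directory:
        type: uri_folder
```

Since `resume_from` is optional, it is omitted here; it would only be set when restarting from a previous pipeline run id.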