
Commit

Update clm.py
AakritiKinra authored Jan 1, 2025
1 parent d19bf86 commit f5ac6cd
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions llments/eval/factscore/clm.py
@@ -64,10 +64,10 @@ def load_model(self) -> None:
     def _generate(
         self,
         prompt: str,
+        prompts: Union[str, List[str]],
         sample_idx: int = 0,
         max_sequence_length: int = 2048,
         max_output_length: int = 128,
-        prompts: Union[str, List[str]] = None,
         end_if_newline: bool = False,
         end_if_second_newline: bool = False,
         verbose: bool = False,
@@ -76,12 +76,12 @@ def _generate(
         Args:
             prompt (str): The input prompt to generate text from.
+            prompts (Union[str, List[str]]): Single prompt string or a list of prompt strings.
             sample_idx (int, optional): Index to differentiate between samples. Defaults to 0.
             max_sequence_length (int, optional): Maximum length of the input sequence.
                 Defaults to 2048.
             max_output_length (int, optional): Maximum length of the generated output.
                 Defaults to 128.
-            prompts (Union[str, List[str]]): Single prompt string or a list of prompt strings.
             end_if_newline (bool, optional): If True, truncate the generation at the first newline.
                 Defaults to False.
             end_if_second_newline (bool, optional): If True, truncate the generation at the second newline.
