Adding a transformer-based encoder-decoder model #2566
Annotations
10 errors
test:
flair/__init__.py#L1
mypy-status
mypy exited with status 1.
|
test:
flair/embeddings/document.py#L1
flair/embeddings/document.py
697: error: Argument "dtype" to "zeros" has incompatible type "Union[dtype, Tensor, Module]"; expected "Optional[dtype]" [arg-type]
|
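A common way to clear this `[arg-type]` error is to cast the module attribute before handing it to `torch.zeros`, since torch's stubs type any attribute looked up on an `nn.Module` as `Union[Tensor, Module]`. A minimal sketch; the function and attribute names are illustrative, not taken from `flair/embeddings/document.py`:

```python
from typing import cast

import torch


def zero_embedding(module: torch.nn.Module, length: int) -> torch.Tensor:
    # Attribute access on an nn.Module is typed Union[Tensor, Module] by torch's
    # stubs, so cast to torch.dtype before passing it as the dtype argument.
    dtype = cast(torch.dtype, module.embedding_dtype)  # hypothetical attribute
    return torch.zeros(length, dtype=dtype)
```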
test:
flair/embeddings/token.py#L1
flair/embeddings/token.py
1469: error: Item "Tensor" of "Union[Tensor, Module]" has no attribute "spm" [union-attr]
1469: error: Item "Tensor" of "Union[Any, Tensor, Module]" has no attribute "vocab_size" [union-attr]
1469: error: "Tensor" not callable [operator]
|
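These `[union-attr]`/`[operator]` errors come from the same stub behaviour: `self.spm` is typed `Union[Tensor, Module]`, so mypy rejects both the `vocab_size` lookup and the call. A hedged sketch of the usual narrowing; only the `spm`/`vocab_size` names come from the log, the surrounding code is assumed:

```python
from typing import Any, cast

import torch


def vocab_size_of(embedding: torch.nn.Module) -> int:
    # Cast the sentencepiece processor attribute (or annotate it with its
    # concrete type, e.g. sentencepiece.SentencePieceProcessor) so the
    # attribute access and the call both type-check.
    spm = cast(Any, embedding.spm)
    return int(spm.vocab_size())
```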
test:
flair/models/__init__.py#L1
Black format check
--- /home/runner/work/flair/flair/flair/models/__init__.py 2025-01-31 10:30:52.176591+00:00
+++ /home/runner/work/flair/flair/flair/models/__init__.py 2025-01-31 10:34:54.758855+00:00
@@ -37,7 +37,7 @@
"TARSTagger",
"TextClassifier",
"TextRegressor",
"MultitaskModel",
"CausalLanguageModelDecoder",
- "EncoderDecoderLanguageModel"
+ "EncoderDecoderLanguageModel",
]
|
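The diff above is the complete fix: Black adds a trailing comma after the last element of a multi-line collection, so `__all__` only needs the comma after `"EncoderDecoderLanguageModel"`.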
test:
flair/models/encoder_decoder_model.py#L1
flair/models/encoder_decoder_model.py
32: error: "staticmethod" used with a non-method [misc]
103: error: Argument 1 to "tie_encoder_to_decoder_recursively" has incompatible type "Optional[Module]"; expected Module [arg-type]
104: error: Argument 2 to "tie_encoder_to_decoder_recursively" has incompatible type "Optional[Module]"; expected Module [arg-type]
138: error: Need type annotation for "local_scope" (hint: "local_scope: dict[<type>, <type>] = ...") [var-annotated]
359: error: Signature of "evaluate" incompatible with supertype "Model" [override]
359: note: Superclass:
359: note: def evaluate(self, data_points: Union[list[Any], Dataset[Any]], gold_label_type: str, out_path: Union[str, Path, None] = ..., embedding_storage_mode: Literal['none', 'cpu', 'gpu'] = ..., mini_batch_size: int = ..., main_evaluation_metric: tuple[str, str] = ..., exclude_labels: Optional[list[str]] = ..., gold_label_dictionary: Optional[Dictionary] = ..., return_loss: bool = ..., **kwargs: Any) -> Result
359: note: Subclass:
359: note: def evaluate(self, data_points: Union[list[DataPoint], Dataset[Any]], mini_batch_size: int = ..., **kwargs: Any) -> Result
|
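Most of these fall into known mypy patterns: drop `@staticmethod` from the module-level function at line 32 (or move it into a class), narrow the `Optional[Module]` arguments before the recursive tie call, annotate the empty `local_scope` container, and keep the `evaluate` override's parameters compatible with `flair.nn.Model.evaluate` (at minimum accepting `gold_label_type`). A minimal sketch of the middle two, using names from the log but not the PR's actual code:

```python
from typing import Any, Optional

import torch.nn as nn


def tie_modules(encoder: Optional[nn.Module], decoder: Optional[nn.Module]) -> None:
    # [arg-type] at 103/104: both arguments must be a plain Module, so fail fast
    # while encoder or decoder is still None, before the recursive tie call.
    if encoder is None or decoder is None:
        raise ValueError("encoder and decoder must be initialized before tying weights")
    # ... call tie_encoder_to_decoder_recursively(decoder, encoder, ...) here,
    #     now that both are narrowed to Module ...


def collect_scope() -> dict:
    # [var-annotated] at 138: annotate the empty container, matching mypy's hint.
    local_scope: dict[str, Any] = {}
    return local_scope
```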
test:
flair/models/multitask_model.py#L1
flair/models/multitask_model.py
210: error: "Tensor" not callable [operator]
|
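Same `[operator]` pattern as in `token.py` above: an attribute looked up on an `nn.Module` is typed `Union[Tensor, Module]`, so `cast` it (or store it with a concrete callable type) before calling it.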
test:
flair/models/sequence_tagger_model.py#L1
flair/models/sequence_tagger_model.py
866: error: Unexpected keyword argument "use_auth_token" [call-arg]
867: error: Item "None" of "Optional[list[RepoSibling]]" has no attribute "__iter__" (not iterable) [union-attr]
|
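Both errors point at the Hugging Face Hub call: recent `huggingface_hub` releases use `token` instead of `use_auth_token`, and `ModelInfo.siblings` is `Optional`, so it needs a guard before iteration. A sketch under those assumptions; the actual call site in `sequence_tagger_model.py` may look different:

```python
from typing import List, Optional

from huggingface_hub import HfApi


def list_repo_files(repo_id: str, token: Optional[str] = None) -> List[str]:
    # token= replaces the use_auth_token= keyword rejected at line 866.
    info = HfApi().model_info(repo_id, token=token)
    # siblings is Optional[list[RepoSibling]], so default to an empty list.
    return [sibling.rfilename for sibling in (info.siblings or [])]
```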
test:
flair/nn/multitask.py#L1
flair/nn/multitask.py
27: error: Argument "models" to "MultitaskModel" has incompatible type "list[Classifier[Any]]"; expected "list[Model[Any]]" [arg-type]
27: note: "List" is invariant -- see https://mypy.readthedocs.io/en/stable/common_issues.html#variance
27: note: Consider using "Sequence" instead, which is covariant
|
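The fix mypy itself suggests is to declare the `models` parameter as `Sequence` instead of `list`, since `Sequence` is covariant. A self-contained illustration with placeholder classes standing in for `flair.nn.Model` / `flair.nn.Classifier`:

```python
from collections.abc import Sequence


class Model: ...  # placeholder for flair.nn.Model

class Classifier(Model): ...  # placeholder for flair.nn.Classifier


def build_multitask(models: Sequence[Model]) -> None:
    # list is invariant, so list[Classifier] is not accepted where list[Model]
    # is expected; Sequence is covariant, so this signature accepts both.
    for model in models:
        ...


classifiers: list[Classifier] = [Classifier(), Classifier()]
build_multitask(classifiers)  # type-checks without a cast
```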
test:
tests/embeddings/test_transformer_document_embeddings.py#L37
test_if_loaded_embeddings_have_all_attributes[False]
ValueError: Could not find any model with name '/home/runner/work/flair/flair/tests/resources/tasks/single.pt'
|
test:
tests/embeddings/test_transformer_word_embeddings.py#L93
TestTransformerWordEmbeddings.test_transformer_jit_embeddings[False]
_pickle.UnpicklingError: Weights only load failed. In PyTorch 2.6, we changed the default value of the `weights_only` argument in `torch.load` from `False` to `True`. Re-running `torch.load` with `weights_only` set to `False` will likely succeed, but it can result in arbitrary code execution. Do it only if you got the file from a trusted source.
Please file an issue with the following so that we can make `weights_only=True` compatible with your use case: WeightsUnpickler error: Unsupported operand 149
Check the documentation of torch.load to learn more about types accepted by default with weights_only https://pytorch.org/docs/stable/generated/torch.load.html.
|
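Two common ways to cope with PyTorch 2.6 defaulting `torch.load(weights_only=True)`, depending on what the test fixture actually contains; the helper and path are placeholders, not the test's real code:

```python
import torch


def load_checkpoint(path: str, is_torchscript: bool = False):
    if is_torchscript:
        # TorchScript artifacts go through torch.jit.load, which is not
        # affected by the weights_only default.
        return torch.jit.load(path)
    # Opt back into full unpickling; only do this for files from a trusted
    # source, such as the test's own fixtures.
    return torch.load(path, weights_only=False)
```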