TypeError: metaclass conflict: the metaclass of a derived class must be a (non-strict) subclass of the metaclasses of all its bases #30

Open
SEUNKOREA opened this issue Jan 3, 2025 · 1 comment

@SEUNKOREA

When I run the code below from the linked notebook
https://github.com/tabtoyou/KoLLaVA/blob/main/KoLLaVA-v1-Kovicuna-7b_inference.ipynb

from transformers import AutoTokenizer, AutoModelForCausalLM
import torch
import os
from llava.conversation import conv_templates, SeparatorStyle
from llava.utils import disable_torch_init
from transformers import CLIPVisionModel, CLIPImageProcessor, StoppingCriteria
from llava.model import *
from llava.model.utils import KeywordsStoppingCriteria

the following error occurs:

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
Cell In[11], line 4
      2 import torch
      3 import os
----> 4 from llava.conversation import conv_templates, SeparatorStyle
      5 from llava.utils import disable_torch_init
      6 from transformers import CLIPVisionModel, CLIPImageProcessor, StoppingCriteria

File /home/share/LSE/hde-intg/KoLLaVA/llava/__init__.py:1
----> 1 from .model import LlavaLlamaForCausalLM

File /home/share/LSE/hde-intg/KoLLaVA/llava/model/__init__.py:1
----> 1 from .language_model.llava_llama import LlavaLlamaForCausalLM, LlavaConfig
      2 from .language_model.llava_mpt import LlavaMPTForCausalLM, LlavaMPTConfig

File /home/share/LSE/hde-intg/KoLLaVA/llava/model/language_model/llava_llama.py:40
     36     def __init__(self, config: LlamaConfig):
     37         super(LlavaLlamaModel, self).__init__(config)
---> 40 class LlavaLlamaForCausalLM(LlamaForCausalLM, LlavaMetaForCausalLM):
     41     config_class = LlavaConfig
     43     def __init__(self, config):

TypeError: metaclass conflict: the metaclass of a derived class must be a (non-strict) subclass of the metaclasses of all its bases
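For context, this TypeError is Python's generic complaint when a derived class's bases have incompatible metaclasses: Python must find a single metaclass that is a (non-strict) subclass of every base's metaclass, and it fails if none exists. A minimal reproduction, independent of the repo code:

# Two unrelated metaclasses: neither is a subclass of the other.
class MetaA(type):
    pass

class MetaB(type):
    pass

class A(metaclass=MetaA):
    pass

class B(metaclass=MetaB):
    pass

# Python cannot choose a metaclass for C that subclasses both MetaA and
# MetaB, so this raises:
# TypeError: metaclass conflict: the metaclass of a derived class must be
# a (non-strict) subclass of the metaclasses of all its bases
class C(A, B):
    pass

In this repo, a stale module left in the running kernel (for example a previously imported transformers or llava) is a plausible trigger, which would be consistent with the error disappearing after a kernel restart, as noted in the follow-up below.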
@SEUNKOREA (Author)

After restarting the kernel, the error above went away, but I then got the following error:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Cell In[3], line 4
      2 import torch
      3 import os
----> 4 from llava.conversation import conv_templates, SeparatorStyle
      5 from llava.utils import disable_torch_init
      6 from transformers import CLIPVisionModel, CLIPImageProcessor, StoppingCriteria

File /home/share/LSE/hde-intg/KoLLaVA/llava/__init__.py:1
----> 1 from .model import LlavaLlamaForCausalLM

File /home/share/LSE/hde-intg/KoLLaVA/llava/model/__init__.py:1
----> 1 from .language_model.llava_llama import LlavaLlamaForCausalLM, LlavaConfig
      2 from .language_model.llava_mpt import LlavaMPTForCausalLM, LlavaMPTConfig

File /home/share/LSE/hde-intg/KoLLaVA/llava/model/language_model/llava_llama.py:110
    107             _inputs['images'] = images
    108         return _inputs
--> 110 AutoConfig.register("llava", LlavaConfig)
    111 AutoModelForCausalLM.register(LlavaConfig, LlavaLlamaForCausalLM)

File ~/envs/hde-intg-dev-3.10/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:1074, in AutoConfig.register(model_type, config, exist_ok)
   1068 if issubclass(config, PretrainedConfig) and config.model_type != model_type:
   1069     raise ValueError(
   1070         "The config you are passing has a `model_type` attribute that is not consistent with the model type "
   1071         f"you passed (config has {config.model_type} and you passed {model_type}. Fix one of those so they "
   1072         "match!"
   1073     )
-> 1074 CONFIG_MAPPING.register(model_type, config, exist_ok=exist_ok)

File ~/envs/hde-intg-dev-3.10/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:773, in _LazyConfigMapping.register(self, key, value, exist_ok)
    769 """
    770 Register a new configuration in this mapping.
    771 """
    772 if key in self._mapping.keys() and not exist_ok:
--> 773     raise ValueError(f"'{key}' is already used by a Transformers config, pick another name.")
    774 self._extra_content[key] = value

ValueError: 'llava' is already used by a Transformers config, pick another name.
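For reference, this name collision arises because recent transformers releases ship their own built-in "llava" model type, so the repo's registration of the same name is rejected. The traceback above shows that AutoConfig.register accepts an exist_ok flag; one possible workaround sketch (assuming your transformers version's AutoModelForCausalLM.register accepts the same flag) is to change the two registration lines in llava/model/language_model/llava_llama.py to:

from transformers import AutoConfig, AutoModelForCausalLM  # already imported in that file

# Permit re-registering the "llava" name over the built-in entry; the
# exist_ok parameter is visible in the AutoConfig.register signature
# shown in the traceback above.
AutoConfig.register("llava", LlavaConfig, exist_ok=True)
AutoModelForCausalLM.register(LlavaConfig, LlavaLlamaForCausalLM, exist_ok=True)

Alternatively, pinning transformers to a release that predates the built-in llava model type, or registering the model under a different name, should also avoid the collision.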
