Added text ai wrapper interface #166
base: main
Conversation
    language_alias: str = LANGUAGE_ALIAS,
    run_deploy_container: bool = True,
    run_deploy_scripts: bool = True,
    run_upload_models: bool = True,
We don't have scripts to deploy. Neither do we have any models to upload upfront.
But we probably need the BucketFS credentials and HF token, just like the TE.
Yes, but we plan on having scripts and default models in the future, and we want to avoid having to change the interface too often.
I am not sure about the HF token. That one is for private models, and I doubt we will have private models as the default models for Text AI, which are the only ones this installer will use. @tkilias what do you think?
Regarding the BucketFS credentials, you might be right. I thought those would be provided through the secret store? Is this not the case?
With regard to the private token, as far as I am concerned, there is no difference between the Text AI and the TE.
Users might use private models, so the token is needed.
Regarding the BucketFS credentials: those come from the AI Lab config.
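For context, a minimal sketch of how the BucketFS credentials could be read from the AI Lab config instead of being passed to this call. The import path, the get accessor, and the key names are assumptions, not taken from this PR:

from exasol.nb_connector.secret_store import Secrets  # import path assumed


def _bucketfs_credentials(conf: Secrets) -> dict:
    # Hypothetical key names; the real AI Lab config may use different ones.
    return {
        "host": conf.get("bfs_host_name"),
        "port": conf.get("bfs_port"),
        "user": conf.get("bfs_user"),
        "password": conf.get("bfs_password"),
    }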
We will have scripts and default models. Scripts will be added in Q2 in the form of span functions, and the default models we will actually need already this quarter.
Having the HF token here still seems kind of iffy to me. This call does not need it, unless we allow users to set a default model, which then hardly seems like a default.
Otherwise, if people want to install additional private models, should that not happen via a different call, which could then also be used to set the token? See the sketch below.
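A hypothetical sketch of such a separate call; the function name and parameters are not part of this PR and only illustrate the suggestion of keeping the HF token out of the initialization call:

from typing import Optional


def install_additional_model(conf, model_name: str, hf_token: Optional[str] = None) -> None:
    """Hypothetical: upload a single, possibly private, model to BucketFS.

    Only a private model would need the Hugging Face token; the public
    default models installed during initialization would not.
    """
    # conf: the Secrets object from the AI Lab secret store (assumed)
    ...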
We can also remove the HF token; I have no strong feelings about it.
    pass


def initialize_text_ai_extension(conf: Secrets,
                                 container_file: Path,
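Since the diff viewer only shows fragments of the signature, here is a sketch of how the pieces above might fit together. The parameter order, the LANGUAGE_ALIAS value, and the comments are assumptions based on the visible fragments and this discussion:

from pathlib import Path

LANGUAGE_ALIAS = "PYTHON3_TXAIE"  # placeholder; the real default is not visible in the diff


def initialize_text_ai_extension(
    conf,                                 # Secrets object from the AI Lab secret store
    container_file: Path,                 # script language container to deploy
    language_alias: str = LANGUAGE_ALIAS,
    run_deploy_container: bool = True,    # deploy the SLC container
    run_deploy_scripts: bool = True,      # create the UDF scripts (planned for Q2)
    run_upload_models: bool = True,       # upload the default models
) -> None:
    pass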
Maybe make the container file and the version optional; this way we can easily switch between them.
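A sketch of that suggestion, assuming a hypothetical version parameter for downloading a released container instead of using a local file:

from pathlib import Path
from typing import Optional


def initialize_text_ai_extension(
    conf,
    container_file: Optional[Path] = None,  # locally built SLC file
    version: Optional[str] = None,          # released version to download instead
) -> None:
    # Exactly one of the two container sources should be given.
    if (container_file is None) == (version is None):
        raise ValueError("Provide exactly one of container_file or version.")
    if container_file is not None:
        ...  # deploy the local container file
    else:
        ...  # download and deploy the released container for `version`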
fixes #164