Inference on a new dataset #20

Open
prateek-pt opened this issue Dec 7, 2020 · 1 comment
@prateek-pt

Is there a way to test the given model on my own dataset, which has a different tag set from CoNLL? I want to use the pre-trained 'eng_conll03' model and fine-tune it with my data, for example by replacing the final dense layer of 5 nodes with one of 10 nodes to match my dataset's tags.

@juntaoy
Owner

juntaoy commented Dec 7, 2020

Basically, what you need to do is avoid restoring the biaffine part of the system; it is not a dense layer but a biaffine tensor.
You can add a method:

def restore_conll03(self, session):
    # Skip variables from the TF-Hub ELMo module ("module/") and the
    # biaffine tensor ("Bilinear/"), which must be re-initialized to
    # match the new number of NER types.
    vars_to_restore = [v for v in tf.global_variables()
                       if "module/" not in v.name and "Bilinear/" not in v.name]
    saver = tf.train.Saver(vars_to_restore)
    checkpoint_path = os.path.join(self.config["log_dir"], "model.max.ckpt")
    print("Restoring from {}".format(checkpoint_path))
    # Initialize all variables first so the excluded ones get fresh values,
    # then overwrite the restorable subset from the checkpoint.
    session.run(tf.global_variables_initializer())
    saver.restore(session, checkpoint_path)

Then you can do your fine-tuning with a different number of NER types.
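To illustrate the idea without needing TensorFlow, here is a minimal sketch of the name-based filtering that `restore_conll03` performs: variables belonging to the ELMo TF-Hub module (`module/`) or the biaffine scorer (`Bilinear/`) are excluded from restoration, so the encoder weights come from the checkpoint while the biaffine tensor is freshly initialized for the new tag set. The variable names below are illustrative, not the actual names in the checkpoint.

```python
def keep_for_restore(var_names):
    """Return the subset of variable names that should be restored
    from the pre-trained checkpoint."""
    return [name for name in var_names
            if "module/" not in name and "Bilinear/" not in name]

# Hypothetical variable names mimicking the layout described above.
all_vars = [
    "module/elmo/char_embed",  # ELMo weights come from TF-Hub, not the checkpoint
    "lstm/fw/kernel",          # encoder weights: restore these
    "Bilinear/weights",        # biaffine tensor: re-initialize for the new tag count
    "char_embeddings",
]

print(keep_for_restore(all_vars))  # -> ['lstm/fw/kernel', 'char_embeddings']
```

The same pattern generalizes: any variable whose shape depends on the number of NER types must be left out of the `Saver`'s `var_list` and re-initialized before training.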
