Autoload shared libraries #78

Open

wants to merge 12 commits into master
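The change adds a `shared_library` argument to `conifer.model.load_model` so that a model's compiled bridge library can be picked up automatically when the model is read back from JSON. A minimal usage sketch (file paths and the timestamp in the library name are hypothetical, assuming a model previously compiled with the `cpp` or `xilinxhls` backend):

```python
import conifer

# Default: look for conifer_bridge_<timestamp>.so next to the model JSON
model = conifer.model.load_model('my_prj/my_prj.json')

# Or point at an explicit shared library
model = conifer.model.load_model(
    'my_prj/my_prj.json',
    shared_library='my_prj/conifer_bridge_1700000000.so',
)
```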
conifer/model.py (14 additions & 1 deletion)
@@ -532,7 +532,7 @@ def make_model(ensembleDict, config=None):
backend = get_backend(backend)
return backend.make_model(ensembleDict, config)

def load_model(filename, new_config=None):
def load_model(filename, shared_library=None, new_config=None):
'''
Load a Model from JSON file

@@ -561,4 +561,17 @@ def load_model(filename, new_config=None):

model = make_model(js, config)
model._metadata = metadata + model._metadata

try:
    if config["backend"] in ["cpp", "xilinxhls"]:
        import importlib.util
        # timestamp recorded in the model metadata, used to name the bridge library
        last_timestamp = int(model._metadata[-2]._to_dict()["time"])
        # look for the shared library in the same directory as the model JSON if not specified
        if shared_library is None:
            shared_library = os.path.join(os.path.dirname(filename), f'conifer_bridge_{last_timestamp}.so')
        spec = importlib.util.spec_from_file_location(f'conifer_bridge_{last_timestamp}', shared_library)
        bridge_module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(bridge_module)
        # the compiled bridge exposes a BDT class bound to the model JSON
        model.bridge = bridge_module.BDT(filename)
except Exception as e:
    logger.warning(f'Could not load the shared library ({e}). Run model.compile()')

return model
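If the shared library is missing or fails to load, the code above only warns and suggests recompiling. A minimal sketch of that fallback path, assuming conifer's existing `compile()` and `decision_function()` API for the cpp/xilinxhls backends (paths and input data are hypothetical):

```python
import numpy as np
import conifer

model = conifer.model.load_model('my_prj/my_prj.json')  # warns if no .so is found
model.compile()                       # rebuilds the conifer_bridge_<timestamp>.so
X = np.random.rand(10, 4)             # hypothetical batch of 4-feature inputs
y = model.decision_function(X)        # inference through the recompiled bridge
```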