Conversation
```python
import pytest
import torch
from metatomic.torch import ase_calculator
from metatrain.utils.io import load_model
```
this is the only bit I'm unsure about: how to correctly load a model here.
This is how models are loaded on the ASE side:

```python
import os
import pathlib

# Load the model, dispatching on the type of the `model` argument
if isinstance(model, (str, bytes, pathlib.PurePath)):
    if not os.path.exists(model):
        raise InputError(f"given model path '{model}' does not exist")
    # only store the model in self.parameters if it is the path to a file
    self.parameters["model"] = str(model)
    model = load_atomistic_model(
        model, extensions_directory=extensions_directory
    )
elif isinstance(model, torch.jit.RecursiveScriptModule):
    if model.original_name != "AtomisticModel":
        raise InputError(
            "torch model must be 'AtomisticModel', "
            f"got '{model.original_name}' instead"
        )
elif isinstance(model, AtomisticModel):
    # already a loaded model, nothing to do
    pass
else:
    raise TypeError(f"unknown type for model: {type(model)}")
```
Thanks @CompRhys! We had a chat about this 1h ago, and plan to create a separate metatomic-torchsim (and metatomic-ase, …) python package that would live in this repository but have a separate version number and release schedule. This might take us a couple of weeks to do; how urgent is getting this code out of torchsim for you? @HaoZeke, in the meantime can you help bring this PR to a state where it supports all the same features as the other engine integrations?
In the same vein, @CompRhys, would you be OK with me making changes to this PR? I'd like to set up the python package first.
Yup, in the package 😁
Not urgent. We just cut a 0.5.2 release, so it will be a while until we cut 0.5.3. Hopefully this could land in that release. The big switch also relies on there being new releases with the upstreamed interfaces. Please feel free to take this over or delete!
Move the torchsim model upstream.