Model Loader
This utility is used by various Infernet workflows to load models from a configured source.
Usage
load_model(model_source: ModelSource, **kwargs)
For the available model sources and their respective keyword arguments, see below.
Returns
The path to the model file on your local file system, after it's been downloaded.
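Since `load_model` returns a plain file-system path, callers can sanity-check it before handing it to a framework loader. A minimal stdlib sketch (the helper name `ensure_model_file` is illustrative, not part of infernet-ml):

```python
import os

def ensure_model_file(model_path: str) -> str:
    """Return model_path if the file exists, else raise a clear error.

    Illustrative helper: load_model's return value is just a path, so a
    quick existence check avoids confusing downstream load errors.
    """
    if not os.path.isfile(model_path):
        raise FileNotFoundError(f"Model file not found: {model_path}")
    return model_path
```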
Model Sources
Available model sources are:
class ModelSource(Enum):
    """
    Enum for the model source
    """

    LOCAL = "local"
    ARWEAVE = "arweave"
    HUGGINGFACE_HUB = "huggingface_hub"
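Since `ModelSource` is a standard Python `Enum`, a source read from configuration as a string can be converted back to an enum member with the enum constructor. A small self-contained sketch (re-declaring the enum shown above):

```python
from enum import Enum

class ModelSource(Enum):
    """Enum for the model source (as declared above)."""
    LOCAL = "local"
    ARWEAVE = "arweave"
    HUGGINGFACE_HUB = "huggingface_hub"

# A string value (e.g. from a config file) maps back to the enum member:
source = ModelSource("huggingface_hub")
assert source is ModelSource.HUGGINGFACE_HUB
```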
Local
Loads the model from the local file system. This is a no-op: the loader simply returns the path passed to it.
Keyword Arguments
model_path (str)
: Path to the model file.
Example
To load the model from the local file system:
from infernet_ml.utils.model_loader import load_model, ModelSource
load_model(
    model_source=ModelSource.LOCAL,
    model_path="path/to/model.torch"
)
Hugging Face Hub
Loads the model from the Hugging Face Hub. Under the hood, it uses the huggingface_hub client library to download models.
Keyword Arguments
repo_id (str)
: The repository id of the model.
filename (str)
: The name of the model file to be downloaded.
Example
To load the model from the Hugging Face Hub:
from infernet_ml.utils.model_loader import load_model, ModelSource
load_model(
    model_source=ModelSource.HUGGINGFACE_HUB,
    repo_id="username/repo_id",
    filename="model.torch"
)
Arweave
Loads the model from Arweave. The input format mimics that of the Hugging Face Hub, except that, to download the model from Arweave, you must also provide the wallet addresses of the model's owners.
Keyword Arguments
repo_id (str)
: The repository id of the model.
filename (str)
: The name of the model file to be downloaded.
owners (list[str])
: A list of Arweave wallet addresses that are the owners of the model.
Example
To load the model from Arweave:
from infernet_ml.utils.model_loader import load_model, ModelSource
load_model(
    model_source=ModelSource.ARWEAVE,
    repo_id="username/repo_id",
    filename="model.torch",
    owners=["wallet_address"]
)
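Because the three sources differ only in their keyword arguments, the call to `load_model` can be assembled from configuration. A hedged sketch — the config layout below is illustrative and not an infernet-ml format:

```python
from enum import Enum

class ModelSource(Enum):
    """Enum for the model source (as declared above)."""
    LOCAL = "local"
    ARWEAVE = "arweave"
    HUGGINGFACE_HUB = "huggingface_hub"

# Illustrative configuration; in practice this might come from JSON or
# environment variables.
config = {
    "model_source": "arweave",
    "args": {
        "repo_id": "username/repo_id",
        "filename": "model.torch",
        "owners": ["wallet_address"],
    },
}

source = ModelSource(config["model_source"])
kwargs = config["args"]
# The actual call would then be:
#     load_model(model_source=source, **kwargs)
```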