Downloading and Running Models from PrunaAI on Hugging Face

This guide provides step-by-step instructions for downloading and running models published on https://huggingface.co/PrunaAI.

Step 1: Install Pruna-Engine

  1. Install the pruna-engine package from PyPI. Visit https://pypi.org/project/pruna-engine/ for detailed instructions.

  2. Make sure you install the pruna-engine version specified in the README of the model on Hugging Face; you can verify the installed version as shown below.
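
To confirm that the installed version matches the model's README, you can check it from Python using the standard library. This is a minimal sketch; pruna-engine is the distribution name on PyPI:

from importlib.metadata import version

print(version("pruna-engine"))  # Should match the version listed in the model's README.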

Step 2: Download the Model

Use the following Bash commands to download the model:

mkdir location-of-model-on-your-machine
huggingface-cli download name-of-model --local-dir location-of-model-on-your-machine --local-dir-use-symlinks False

Replace name-of-model with the Hugging Face repository name, and location-of-model-on-your-machine with the local directory where you want the model stored.

For example, for the model at https://huggingface.co/PrunaAI/SimianLuo-LCM_Dreamshaper_v7-smashed, use PrunaAI/SimianLuo-LCM_Dreamshaper_v7-smashed as name-of-model.
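
Alternatively, you can download the repository from Python with the snapshot_download function from the huggingface_hub library (the same library that provides huggingface-cli). A minimal sketch using the example repository above:

from huggingface_hub import snapshot_download

# Downloads the full repository into the given local directory,
# equivalent to the CLI command above.
snapshot_download(
    repo_id="PrunaAI/SimianLuo-LCM_Dreamshaper_v7-smashed",
    local_dir="location-of-model-on-your-machine",
)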

Step 3: Load the Model

With pruna-engine installed (Step 1), import the PrunaModel class and load the model in Python:

from pruna_engine.PrunaModel import PrunaModel

model_path = "location-of-model-on-your-machine"  # Specify the downloaded model path here.
smashed_model = PrunaModel.load_model(model_path)  # Load the model.

Step 4: Run the Model

Run the model using Python:

output = smashed_model(model_input)

Replace model_input with input appropriate for the model (see the model's README on Hugging Face for the expected format). The model's output will be stored in output.
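
Putting Steps 3 and 4 together, here is an end-to-end sketch. The input format depends on the model; the text prompt below is an assumption for illustration based on the text-to-image example above, so check the model's README on Hugging Face for the exact call:

from pruna_engine.PrunaModel import PrunaModel

model_path = "location-of-model-on-your-machine"  # Directory from Step 2.
smashed_model = PrunaModel.load_model(model_path)  # Load the smashed model.

# Hypothetical input: a text prompt, assuming a text-to-image model
# such as the SimianLuo-LCM_Dreamshaper_v7-smashed example.
prompt = "a photo of a mountain lake at sunrise"
output = smashed_model(prompt)
print(output)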