Please install MindSpore 2.1 by referring to MindSpore Install.
For MindSpore Lite, refer to Lite Install:
- Download the supporting tar.gz and whl packages according to your environment.
- Unzip the tar.gz package and install the corresponding version of the whl package:

  ```shell
  tar -zxvf mindspore-lite-2.1.0-*.tar.gz
  pip install mindspore_lite-2.1.0-*.whl
  ```

- Configure Lite's environment variables. `LITE_HOME` is the folder path extracted from the tar.gz; an absolute path is recommended:

  ```shell
  export LITE_HOME=/path/to/mindspore-lite-{version}-{os}-{platform}
  export LD_LIBRARY_PATH=$LITE_HOME/runtime/lib:$LITE_HOME/tools/converter/lib:$LD_LIBRARY_PATH
  export PATH=$LITE_HOME/tools/converter/converter:$LITE_HOME/tools/benchmark:$PATH
  ```
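The exports above simply compose paths under `LITE_HOME`. As an illustration (the `lite_home` value is a placeholder, not a real installation path), the resulting search paths look like this:

```python
import os

# Hypothetical LITE_HOME value; substitute your actual extraction path.
lite_home = "/path/to/mindspore-lite-2.1.0-linux-x64"

# Mirror the shell exports: prepend the Lite runtime and converter
# libraries to LD_LIBRARY_PATH, and the converter/benchmark tools to PATH.
ld_library_path = os.pathsep.join([
    os.path.join(lite_home, "runtime", "lib"),
    os.path.join(lite_home, "tools", "converter", "lib"),
    os.environ.get("LD_LIBRARY_PATH", ""),
])
path = os.pathsep.join([
    os.path.join(lite_home, "tools", "converter", "converter"),
    os.path.join(lite_home, "tools", "benchmark"),
    os.environ.get("PATH", ""),
])
print(ld_library_path)
```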
We provide a script for converting pre-trained weights from .safetensors to .ckpt in tools/model_conversion/convert_weight.py.
Step 1. Download the official pre-trained weights SDXL-base-1.0 from Hugging Face.
Step 2. Convert the weights to MindSpore .ckpt format and put them in ./models/.
```shell
# convert sdxl-base-1.0 model
cd tools/model_conversion
python convert_weight.py \
  --task pt_to_ms \
  --weight_safetensors /PATH TO/sd_xl_base_1.0.safetensors \
  --weight_ms /PATH TO/sd_xl_base_1.0_ms.ckpt \
  --key_torch torch_key_base.yaml \
  --key_ms mindspore_key_base.yaml
```
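The two YAML files presumably pair each PyTorch parameter name with its MindSpore counterpart, so the conversion is, at its core, a key rename over the loaded state dict. A minimal illustrative sketch of that renaming (toy keys, not the real mapping files):

```python
# Toy illustration of name-mapped weight conversion; the real script reads
# the mappings from torch_key_base.yaml / mindspore_key_base.yaml.
torch_keys = [
    "model.diffusion_model.input_blocks.0.0.weight",
    "cond_stage_model.transformer.text_model.embeddings.position_embedding.weight",
]
ms_keys = [  # hypothetical target names for illustration only
    "model.diffusion_model.input_blocks.0.0.conv.weight",
    "cond_stage_model.transformer.embedding_table",
]
key_map = dict(zip(torch_keys, ms_keys))

# Pretend these are tensors loaded from the .safetensors file.
torch_state = {k: [0.0] for k in torch_keys}

# Rename every parameter under the mapping.
ms_state = {key_map[k]: v for k, v in torch_state.items()}
```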
Export the MindSpore model to MindIR format, for example:

```shell
python export.py --task=text2img --model=./config/model/sd_xl_base_inference.yaml --n_samples=1
```

Note: The MindIR file will be generated in `output/[MODEL_NAME]-[TASK]`.
Please use the `converter_lite` command to convert a MindSpore MindIR model to a MindSpore Lite model, for example:

```shell
converter_lite --fmk=MINDIR --saveType=MINDIR --optimize=ascend_oriented \
    --modelFile=./output/[MODEL_NAME]-[TASK]/data_prepare_graph.mindir \
    --outputFile=./output/[MODEL_NAME]-[TASK]/data_prepare_graph_lite \
    --configFile=./config/lite/sd_lite.cfg
```

Note: The Lite model name ends with `_lite.mindir`.
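Export may produce several MindIR graphs (the data-prepare graph above is one of them), and each needs its own `converter_lite` invocation with the same flag pattern. A hypothetical helper that builds those command lines (the directory and graph names are placeholders):

```python
def lite_convert_cmd(model_dir: str, graph: str,
                     cfg: str = "./config/lite/sd_lite.cfg") -> str:
    """Build the converter_lite command for one exported MindIR graph.

    Following the naming convention noted above, the output name ends
    with _lite and converter_lite appends the .mindir suffix itself.
    """
    return (
        "converter_lite --fmk=MINDIR --saveType=MINDIR --optimize=ascend_oriented"
        f" --modelFile={model_dir}/{graph}.mindir"
        f" --outputFile={model_dir}/{graph}_lite"
        f" --configFile={cfg}"
    )

# Hypothetical output directory and graph list; check your actual
# output/[MODEL_NAME]-[TASK] directory for the real file names.
cmds = [lite_convert_cmd("./output/sd_xl_base-text2img", g)
        for g in ["data_prepare_graph"]]
```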
After all model conversions, run `sd_lite_infer.py` to generate images for the prompt of your interest, for example:

```shell
python sd_lite_infer.py --task=text2img --model=./config/model/sd_xl_base_inference.yaml \
    --sampler=./config/schedule/euler_edm.yaml --sampling_steps=40 --n_iter=1 --n_samples=1 --scale=9.0
```

Note: `n_samples` must be the same as the value used in export.
For more details on online inference, see GETTING_STARTED.
Run sd_infer.py to generate images for the prompt of your interest.
```shell
python sd_infer.py --device_target=Ascend --task=text2img --model=./config/model/sd_xl_base_inference.yaml \
    --sampler=./config/schedule/euler_edm.yaml --sampling_steps=40 --n_iter=5 --n_samples=1 --scale=9.0
```

- device_target: Device target, default is Ascend.
- task: Task name, should be [text2img]. The chosen task uses config/[task].yaml for its inputs; default is text2img.
- model: Path to config which constructs the model. Must be set, you can select a yaml from ./inference/config/model.
- sampler: Infer sampler yaml path, default is ./config/schedule/euler_edm.yaml.
- sampling_steps: Number of sampling steps, default is 40.
- n_iter: Number of iterations or trials, default is 1.
- n_samples: How many samples to produce for each given prompt in an iteration. A.k.a. batch size, default is 1.
- scale: Unconditional guidance scale. Generally set 7.5 for v1.x and 9.0 for v2.x.
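The scale argument is the standard classifier-free guidance weight: at each sampling step the unconditional and conditional noise predictions are combined so that a larger scale pushes the result further toward the prompt. A minimal numeric sketch of that combination (the generic formula, not the repository's exact code):

```python
# Classifier-free guidance: e = e_uncond + scale * (e_cond - e_uncond).
# With scale = 1.0 this reduces to the conditional prediction alone.
def guide(e_uncond, e_cond, scale):
    return [u + scale * (c - u) for u, c in zip(e_uncond, e_cond)]

# Toy per-element noise predictions, guided with scale = 9.0.
print(guide([0.1, 0.2], [0.3, 0.6], 9.0))
```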
Please run python sd_infer.py -h for details of command parameters.
The prompt, negative_prompt, image_path, and the generated image height and width can be set in config/[task].yaml.
The generated images are saved in "output/samples".
| Model Name | Device | MindSpore | MindSpore Lite | CANN | Task | Image Size | Batch Size | Sampling Steps | Time per Image |
|---|---|---|---|---|---|---|---|---|---|
| sd_xl_base_1.0 | Ascend 910* | 2.2.10 | 2.2.10 | C15 | text2img | 1024*1024 | 1 | 40 | 6.172 s |
| sd_xl_base_1.0 | Ascend 910 | 2.1.0 | 2.1.0 | C15 | text2img | 1024*1024 | 1 | 40 | 17.20 s |
| sd_xl_base_1.0 | Ascend 310P | 2.1.0 | 2.1.0 | C15 | text2img | 1024*1024 | 1 | 40 | 118 s |
The sampler schedule is euler_edm.
| Device | Online Inference (MindSpore) | Offline Inference (Lite) |
|---|---|---|
| Ascend 910* | ✅ | ✅ |
| Ascend 910 | ✅ | ✅ |
| Ascend 310P | - | ✅ |