Unable to run inference using Segmind Tiny SD model #603
Comments
You don't need to convert the checkpoint to GGUF, by the way. I think there are two problems here:
So to "fix" this, you would first need to make sure the text_encoder is included in the model file, and then you'd still have to add support for this model (I think this would require making changes to the code).
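One quick way to check whether the text-encoder weights actually made it into the single-file checkpoint is to list the tensor names in the .safetensors header; SD 1.x single-file checkpoints typically store the CLIP text encoder under a `cond_stage_model.` prefix. A minimal sketch (the toy file and the exact key name here are illustrative assumptions, not taken from this issue):

```python
import io
import json
import struct

def tensor_names(fp):
    """Read tensor names from a .safetensors stream.

    Layout: 8-byte little-endian header length, then a JSON header
    mapping tensor names to dtype/shape/offset info.
    """
    header_len = struct.unpack("<Q", fp.read(8))[0]
    header = json.loads(fp.read(header_len))
    return [k for k in header if k != "__metadata__"]

# Build a minimal in-memory .safetensors file for demonstration:
# one fp32 scalar tensor named like a CLIP text-encoder weight.
name = "cond_stage_model.transformer.text_model.embeddings.position_ids"
header = {name: {"dtype": "F32", "shape": [1], "data_offsets": [0, 4]}}
blob = json.dumps(header).encode()
buf = io.BytesIO(struct.pack("<Q", len(blob)) + blob + b"\x00\x00\x00\x00")

names = tensor_names(buf)
# A checkpoint with no "cond_stage_model." keys has no text encoder baked in.
has_text_encoder = any(k.startswith("cond_stage_model.") for k in names)
print(has_text_encoder)
```

For a real checkpoint, open the file with `open(path, "rb")` instead of the in-memory buffer and inspect the returned names.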
Thanks for your reply @stduhpf. The tiny-sd model (https://huggingface.co/segmind/tiny-sd/tree/main) does have a text encoder, so the error is not caused by the absence of a text encoder; it must be something else.
Yes, there is a text encoder on the HF repo; I already knew that (that's how I knew it should be detected as an SD 1.x model). I was only saying the text encoder might be missing from the .safetensors file. It's also possible it was included under a different name than the one expected.
Hi there,
I am trying to run Segmind's distilled diffusion model (segmind/tiny-sd) from Hugging Face on my local machine.
I successfully converted the model to a single checkpoint file (.safetensors / .ckpt).
Then, I used the convert function in stable-diffusion.cpp to convert the .safetensors file to the GGUF format. The conversion completed without any errors, but the resulting GGUF file is significantly smaller than the original model.
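One thing worth ruling out first: if the conversion wrote a quantized tensor type (e.g. q8_0) rather than f16/f32, a much smaller file is expected and is not by itself a sign of data loss. A rough size estimate (the parameter count below is a hypothetical placeholder, not a figure from this issue):

```python
# Approximate bytes per weight for common storage types.
# q8_0 packs 32 weights per block: 32 int8 values plus one fp16 scale,
# i.e. (32 + 2) / 32 = 1.0625 bytes per weight.
BYTES_PER_WEIGHT = {"f32": 4.0, "f16": 2.0, "q8_0": 34 / 32}

def expected_size_gb(n_params, dtype):
    """Rough on-disk size of a checkpoint holding n_params weights."""
    return n_params * BYTES_PER_WEIGHT[dtype] / 1e9

# Hypothetical parameter count for a small distilled SD model.
n = 600_000_000
for t in ("f32", "f16", "q8_0"):
    print(t, round(expected_size_gb(n, t), 2))
```

So a q8_0 file roughly a quarter the size of an fp32 checkpoint is normal; a file that is far smaller than even that would suggest tensors were actually dropped during conversion.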
When I attempt to run inference with the converted GGUF file, I encounter an error.
It appears that the issue may lie in the conversion to GGUF, possibly because the model in question is a distilled version (tiny-sd). Has anyone worked with distilled models in this context and found a fix for this issue?
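A failure like this is consistent with an architecture mismatch: distilled UNets such as tiny-sd remove some blocks, so a loader expecting the full SD 1.x tensor layout will look for weights that simply aren't there. Comparing the key sets of a full and a distilled checkpoint shows exactly which tensors the loader would fail to find. A toy sketch (the key names are hypothetical, not read from either model):

```python
# Simulated state-dict key sets: a full SD 1.x UNet vs. a distilled one
# that dropped a block. Real checkpoints have hundreds of keys.
full = {
    "model.diffusion_model.input_blocks.0.0.weight",
    "model.diffusion_model.input_blocks.1.0.weight",
    "model.diffusion_model.middle_block.0.weight",
}
distilled = {
    "model.diffusion_model.input_blocks.0.0.weight",
    "model.diffusion_model.middle_block.0.weight",
}

# Keys the loader expects but the distilled checkpoint does not provide.
missing = sorted(full - distilled)
print(missing)
```

Running the same set difference over the real checkpoints' keys (e.g. loaded with the safetensors library) would show which blocks the distillation removed, and hence what the loader would need to tolerate.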
Any insights or suggestions would be greatly appreciated.
Thank you for your time and assistance!