Could not load / run a Flux model (v1.84 regression too) #1376
Comments
Yes, there's a bug with image generation in this version; you need to disable the "AutoGuess" chat completions adapter. It will be fixed in the next version. |
Should be fixed in 1.84 |
Some models do load fine, but then I get models that have this problem: this is PixelWave FLUX.1-dev3 using the Hyper Flux Accelerator LoRA. |
I had to switch back to v1.83.1 because v1.84 seems to have more problems with image generation (e.g. the "The response could not be sent, maybe connection was terminated?" problem above). |
@phr00t I think the badmix model is defective; it doesn't load on earlier versions either, and I saw some other people have issues with it too. Something about the tensor dimensions being incorrect. As for the "Image Generation Failed", I tried PixelWave with https://civitai.com/api/download/models/992642?type=Model&format=GGUF&size=full&fp=nf4 and it works for me. Can you try running with 1.84.1 and debugmode on to see what error is displayed? |
I'm back to using FluxFusion V2 and I'm not having trouble loading it or generating images with v1.84.1. I didn't see an option for "debugmode" in the GUI or KoboldAI Lite; perhaps it is a command line argument? |
Sorry, can you try v1.84.2? I just released a patch: https://github.com/LostRuins/koboldcpp/releases/latest Also, debug mode can be enabled from the hardware panel. |
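For reference, here is a minimal sketch of turning on the same debug output from the command line instead of the GUI hardware panel. The flag names (`--model`, `--sdmodel`, `--debugmode`) are from memory of KoboldCpp's CLI help and the file paths are placeholders, so check `python koboldcpp.py --help` for your version before relying on them.

```bash
# Sketch only: launch KoboldCpp with verbose debug output and an image model
# loaded for Flux/Stable Diffusion generation. Paths below are placeholders.
python koboldcpp.py \
    --model ./your-text-model.gguf \
    --sdmodel ./pixelwave_flux1-dev-Q4_K_S.gguf \
    --debugmode
```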
https://civitai.com/models/1093711?modelVersionId=1229015
I'm trying to use the 10-step Q4_K_S model :(
KoboldCpp 1.83.1 (latest as of this issue)
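For anyone trying to narrow this down, a minimal reproduction sketch is to bypass KoboldAI Lite and send a request straight to KoboldCpp's A1111-compatible image endpoint, so the raw error body is visible instead of just "Image Generation Failed". The port (5001) and route below are the usual defaults, and the prompt/settings are arbitrary placeholders; adjust for your setup.

```bash
# Sketch only: send a bare txt2img request directly to the KoboldCpp API and
# inspect whatever comes back, rather than relying on the Lite UI popup.
# Default port is 5001; change it if you launched with a different --port.
curl -s http://localhost:5001/sdapi/v1/txt2img \
  -H "Content-Type: application/json" \
  -d '{"prompt": "a lighthouse at dusk", "steps": 10, "width": 512, "height": 512}'
```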