Flux Training: AttributeError: 'T5EncoderModel' object has no attribute 'text_model' #1451
Comments
Thanks for the response. I have updated the dependencies, but it is still not working. The environment seems fine, but the T5 model structure is not handled correctly. For T5, I use t5xxl_fp16.safetensors from https://huggingface.co/stabilityai/stable-diffusion-3-medium/tree/main/text_encoders. Unlike CLIP-L, transformers.T5EncoderModel does not have a "text_model" attribute; it exposes "encoder" instead.
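For illustration, a quick way to compare the two layouts in transformers (the small public checkpoints below are chosen only to keep the download light; the actual Flux setup uses CLIP-L and T5-XXL):

```python
from transformers import CLIPTextModel, T5EncoderModel

clip = CLIPTextModel.from_pretrained("openai/clip-vit-base-patch32")
t5 = T5EncoderModel.from_pretrained("google/t5-v1_1-small")

# CLIPTextModel wraps its transformer in .text_model, while
# T5EncoderModel exposes .shared (token embeddings) and .encoder instead.
print([name for name, _ in clip.named_children()])  # ['text_model']
print([name for name, _ in t5.named_children()])    # ['shared', 'encoder']
print(hasattr(t5, "text_model"))                    # False
```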
Could you please share the full stack trace of the error?
Error msg and commands (T5 printed):
100%|██████████| 100/100 [00:00<00:00, 2129088.32it/s]
Pip:
It looks like you're using an old version of the source code. Please git pull to the latest version.
It works now. The key is to make sure cache_text_encoder_outputs=True; setting only cache_text_encoder_outputs_to_disk=True causes the error. When cache_text_encoder_outputs=False and T5 is loaded on the GPU, the following lines cause the error:
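The originally quoted lines are not reproduced here; as a hypothetical illustration of the pattern (tiny randomly initialized encoders stand in for CLIP-L and T5-XXL, and weight_dtype and the loop are assumptions, not the repository's exact code):

```python
import torch
from transformers import CLIPTextConfig, CLIPTextModel, T5Config, T5EncoderModel

# Tiny randomly initialized stand-ins so the error reproduces without downloads.
clip = CLIPTextModel(CLIPTextConfig(hidden_size=32, intermediate_size=64,
                                    num_hidden_layers=1, num_attention_heads=2))
t5 = T5EncoderModel(T5Config(d_model=32, d_ff=64, num_layers=1, num_heads=2))

weight_dtype = torch.bfloat16  # assumed training dtype
for text_encoder in (clip, t5):
    # Works for CLIPTextModel, but the T5 encoder has no .text_model,
    # so the second iteration raises:
    # AttributeError: 'T5EncoderModel' object has no attribute 'text_model'
    text_encoder.text_model.embeddings.to(dtype=weight_dtype)
```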
Thank you for reporting the issue. I will set cache_text_encoder_outputs to True when cache_text_encoder_outputs_to_disk=True.
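A minimal sketch of such a guard, using the flag names from this thread (the function name, message, and exact placement are assumptions, not the repository's code):

```python
def ensure_cache_flags_consistent(args):
    # Caching to disk only makes sense if text encoder outputs are cached at all;
    # without this, training falls back to the on-GPU text encoder path that
    # assumes the CLIP-style .text_model layout and breaks on T5EncoderModel.
    if args.cache_text_encoder_outputs_to_disk and not args.cache_text_encoder_outputs:
        print("cache_text_encoder_outputs_to_disk is set, so cache_text_encoder_outputs is enabled as well")
        args.cache_text_encoder_outputs = True
    return args
```

Here args would be the argparse.Namespace produced by the training script's option parser.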
I updated the code. Thanks!
The Flux inference script works fine, but the training script raises the following error:
AttributeError: 'T5EncoderModel' object has no attribute 'text_model'
Has anyone encountered the same issue?