Hugging face token-classification output causes non JSON serializable error #904
Comments
I think I have the fix. Will open a PR shortly.
From the exception, that image used an old version of the code, not the latest. Maybe the PR has not been released yet.
@pepesi I'm not sure what you mean? The image I used is the latest tagged release, from a week ago, and #692 was merged in that version of the code. What's more, I was able to reproduce it locally with the latest code on the main branch. Please see the MR I have opened to resolve the exception being thrown.
That code has not been released on the release/1.2.x branch; you can view the code here.
Ahh ok, I see now it wasn't in the tag, but there is still an issue with master. Token-classification output is still wrong. These changes are needed: https://github.com/SeldonIO/MLServer/pull/905/files
Not sure what happened, could you show more detail?
I gave steps to reproduce in the PR. Load the model mentioned here, or any token-classification model, and send it a request. The PR fixes the issue: numpy floats were not being converted before being serialized to JSON.
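To illustrate the failure mode described above, here is a minimal standalone sketch (not MLServer code, just an assumption of the mechanism): `json.dumps` raises `TypeError` on a `np.float32` score, while converting it to a native Python float with `.item()` serializes cleanly.

```python
import json

import numpy as np

# A token-classification result typically carries a numpy float score.
score = np.float32(0.9987)

# json.dumps rejects numpy scalar types with TypeError.
try:
    json.dumps({"entity": "ORG", "score": score})
except TypeError:
    serializable = False
else:
    serializable = True

# Converting with .item() yields a native Python float, which serializes fine.
payload = json.dumps({"entity": "ORG", "score": score.item()})
```

This is the general shape of the fix: coerce numpy scalars to native Python types before handing the payload to the JSON encoder.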
I think PR #905 may have no effect on the problem; the key is that you used an old version of the code. When encoding a response, the content_type in the metadata does not take effect: the codec is found by inspecting the content itself, not the content_type (see the code below). MLServer/mlserver/codecs/utils.py Lines 90 to 104 in a685106
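As a rough sketch of the content-based codec selection being described (a simplified illustration, not MLServer's actual implementation; the class and function names here are hypothetical), selection asks each registered codec whether it can encode the value, ignoring any content_type hint:

```python
from typing import Any, Optional, Type


class NumpyCodec:
    """Hypothetical codec for numpy-like values."""

    @staticmethod
    def can_encode(value: Any) -> bool:
        # Duck-type check: numpy arrays and scalars expose a dtype attribute.
        return hasattr(value, "dtype")


class ListCodec:
    """Hypothetical codec for plain Python lists."""

    @staticmethod
    def can_encode(value: Any) -> bool:
        return isinstance(value, list)


_CODECS = [NumpyCodec, ListCodec]


def find_codec(value: Any) -> Optional[Type]:
    # Selection inspects the *content* itself; a content_type hint in the
    # metadata plays no part here, which is the behaviour described above.
    for codec in _CODECS:
        if codec.can_encode(value):
            return codec
    return None
```

Under this scheme, changing the metadata's content_type alone would not alter which codec encodes the response, so an outdated image would keep failing until the code itself is updated.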
Could you please use a clean checkout of master locally and try it again?
I can confirm that @pepesi's PR didn't make it into that release. So, just to double check, did @pepesi's suggestion sort out your issue? PS: Thanks a lot for following up on this one @pepesi 👍
Yes it did, thanks.
Looks like there was a recent MR to support all types of huggingface inputs/outputs; however, token-classification outputs are failing for me. I was able to reproduce with a couple of token-classification models; cmarkea/distilcamembert-base-ner, for example, threw the following error.
Using docker image index.docker.io/seldonio/mlserver@sha256:7f7806e8ed781979bb5ef4d7774156a31046c8832d76b57403127add33064872