Feature description
The LLM interface should return logprobs along with the generated response. Most APIs support this feature, but the current ragbits implementation simply ignores this part of the output and returns only the generated text.
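For reference, a minimal sketch of how logprobs are exposed by the OpenAI Python SDK, one of the APIs that support this. The model name and prompt are illustrative placeholders; this is not the ragbits interface.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Is Paris the capital of France? Answer Yes or No."}],
    logprobs=True,    # ask the API to return token log-probabilities
    top_logprobs=2,   # also return the 2 most likely alternatives per token
    max_tokens=1,
)

# Each generated token carries its logprob alongside the text.
for token_info in response.choices[0].logprobs.content:
    print(token_info.token, token_info.logprob)
```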
Motivation
I want to create an LLM-based cross-encoder, similar to the one from the OpenAI cookbook. To do this I need access to the logprobs of the generated response; with the current implementation this cookbook cannot be reproduced.
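A rough sketch of the kind of scoring this enables, assuming an OpenAI-style API: ask the model a yes/no relevance question and convert the logprob of the first generated token into a probability. The function name and prompt are made up for illustration and are not part of ragbits or the cookbook.

```python
import math

from openai import OpenAI

client = OpenAI()

def relevance_score(query: str, document: str) -> float:
    """Return an estimate of P("Yes"): the document is relevant to the query."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{
            "role": "user",
            "content": (
                f"Query: {query}\nDocument: {document}\n"
                "Is the document relevant to the query? Answer Yes or No."
            ),
        }],
        logprobs=True,
        max_tokens=1,
    )
    token_info = response.choices[0].logprobs.content[0]
    prob = math.exp(token_info.logprob)  # logprob -> probability
    # If the model answered "No", treat 1 - P("No") as the "Yes" score.
    return prob if token_info.token.strip().lower() == "yes" else 1.0 - prob
```

Without logprobs in the LLM interface's return value, this score cannot be computed; only the bare "Yes"/"No" string is available.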
Additional context
No response