
feat: logprobs from LLM #261

Closed
micpst opened this issue Dec 19, 2024 · 0 comments
Labels
feature New feature or request
Milestone

Comments

@micpst
Collaborator

micpst commented Dec 19, 2024

Feature description

The LLM interface should return logprobs along with the generated response. Most APIs support this feature, but the current ragbits implementation simply ignores this type of output and focuses only on the generated text.

Motivation

I want to create an LLM-based cross-encoder, similar to the one from the OpenAI cookbook. To do this I need access to the logprobs of the generated response; with the current implementation I cannot implement this cookbook.
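To illustrate the use case, here is a minimal sketch of the cookbook-style cross-encoder idea: the model is asked a yes/no relevance question, and the logprob of its first generated token is converted into a relevance probability via `exp(logprob)`. The `TokenLogprob`, `LLMResponse`, and `relevance_score` names are hypothetical illustrations of what a logprobs-bearing response type could look like, not the actual ragbits API.

```python
import math
from dataclasses import dataclass


@dataclass
class TokenLogprob:
    """One generated token and its natural-log probability (hypothetical type)."""
    token: str
    logprob: float


@dataclass
class LLMResponse:
    """A completion plus per-token logprobs (hypothetical type, not ragbits' API)."""
    text: str
    logprobs: list[TokenLogprob]


def relevance_score(response: LLMResponse, yes_token: str = "Yes") -> float:
    """Cross-encoder-style scoring: the model answers 'Yes' or 'No' to a
    relevance question, and exp(logprob) of the first token gives the
    model's confidence. If the answer was 'No', flip it to P('Yes')."""
    first = response.logprobs[0]
    p = math.exp(first.logprob)
    return p if first.token == yes_token else 1.0 - p


# Example: the model answered "Yes" with logprob -0.105 (~0.90 probability).
resp = LLMResponse(text="Yes", logprobs=[TokenLogprob("Yes", -0.105)])
print(round(relevance_score(resp), 2))  # → 0.9
```

None of this is possible today because the response object discards the logprobs that the underlying API already returns; the feature request is to surface them.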

Additional context

No response

@micpst micpst added the feature New feature or request label Dec 19, 2024
@micpst micpst closed this as completed Jan 17, 2025
@github-project-automation github-project-automation bot moved this to Done in ragbits Jan 17, 2025
@mhordynski mhordynski added this to the Ragbits 0.5 milestone Feb 11, 2025
Projects
Status: Done
Development

No branches or pull requests

2 participants