vllm #265
Looking forward to vLLM being integrated!!
You might want to try our OpenAI-compatible endpoint with baseURL = http://localhost:8000/v1 for now! We plan on adding vLLM as a separate provider for clarity, but it will simply be routed through the OpenAI-compatible endpoint, which works right now.
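To illustrate the workaround above, here is a minimal sketch of how a chat-completion request to vLLM's OpenAI-compatible server at that baseURL would be shaped. It uses only the Python standard library and builds the request without sending it, so it runs even with no server up; the model name is an assumption, so substitute whichever model you are serving.

```python
import json
from urllib.request import Request

# vLLM's OpenAI-compatible server defaults to this base URL when started
# locally with its standard settings.
BASE_URL = "http://localhost:8000/v1"

def build_chat_request(model: str, prompt: str) -> Request:
    """Build (but do not send) an OpenAI-style chat completion request."""
    payload = {
        "model": model,  # hypothetical model name; use the one you serve
        "messages": [{"role": "user", "content": prompt}],
    }
    return Request(
        url=f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("my-model", "Hello!")
print(req.full_url)  # the endpoint any OpenAI-compatible client would hit
```

Sending `req` with `urllib.request.urlopen` (or pointing any OpenAI-compatible client at the same baseURL) is all the routing the provider setup described above needs.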
I see, @andrewpareles. Now waiting for Autocomplete 😅
It will be in the upcoming release (this week)!