
vllm #265

Open
mathewpareles opened this issue Feb 4, 2025 · 4 comments

Comments

@mathewpareles (Contributor)

No description provided.

@mathewpareles mathewpareles moved this to 🔎 Improvements (A) in 🚙 Void Roadmap Feb 4, 2025
@AayushSameerShah

Looking forward to vLLM being integrated!!

@andrewpareles (Contributor)

You might want to try our OpenAI-compatible endpoint with baseURL = http://localhost:8000/v1 for now!

We plan to add vLLM as a separate provider for clarity, but it will simply be routed through the OpenAI-compatible endpoint, which works right now.
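To illustrate the suggestion above: since vLLM serves an OpenAI-compatible API, any OpenAI-style client pointed at `http://localhost:8000/v1` should work. Here is a minimal stdlib-only sketch, assuming a vLLM server is already running (e.g. via `vllm serve <model>`); the model name and prompt are hypothetical placeholders.

```python
import json
import urllib.request

# vLLM's OpenAI-compatible endpoint (assumes a locally running server)
BASE_URL = "http://localhost:8000/v1"


def build_chat_payload(prompt: str, model: str) -> dict:
    """Build an OpenAI-style chat-completions request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


def chat(prompt: str, model: str = "my-local-model") -> str:
    """Send a chat request to the vLLM server and return the reply text."""
    payload = build_chat_payload(prompt, model)
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # vLLM does not require a real API key by default
            "Authorization": "Bearer EMPTY",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

The same base URL works with the official `openai` Python package by passing `base_url="http://localhost:8000/v1"` to the client constructor, which is effectively what an "OpenAI-compatible" provider setting does.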

@AayushSameerShah

I see, @andrewpareles.
Tried it and it works; thanks!

Now, waiting for Autocomplete 😅

@andrewpareles (Contributor)

It will be in the upcoming release (this week)!
