add an adaptation for a local model of LM Studio #271

Open
kongzhilv opened this issue Feb 5, 2025 · 2 comments

Comments

@kongzhilv

**I want to use the better debugging of local models in LM Studio to make my embedded code generation more targeted.**

@andrewpareles andrewpareles moved this to 🔎 Improvements (A) in 🚙 Void Roadmap Feb 6, 2025
@devlux76

devlux76 commented Feb 9, 2025

Is there any reason you wouldn't just use the OpenAI-compatible endpoint provided by LM Studio? I know on Ollama this is :11434/v1; I'm pretty sure LM Studio serves it on a different port, but it's still there.
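For reference, a minimal sketch of what calling such an OpenAI-compatible endpoint looks like. The base URL is an assumption: LM Studio's documented default local server is `http://localhost:1234/v1`, and the model name is a placeholder; check the Local Server tab in the app for the actual values.

```python
import json

# Assumed LM Studio default; verify in the app's "Local Server" tab.
BASE_URL = "http://localhost:1234/v1"

def build_chat_request(model: str, prompt: str) -> tuple[str, bytes]:
    """Return the endpoint URL and JSON body for an OpenAI-style
    chat-completions call against a local server."""
    url = f"{BASE_URL}/chat/completions"
    body = json.dumps({
        "model": model,  # identifier of the model loaded in LM Studio
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,
    }).encode("utf-8")
    return url, body

# Actually sending the request needs a running server, so it is left
# commented out here:
# import urllib.request
# url, body = build_chat_request("local-model", "Hello")
# req = urllib.request.Request(
#     url, data=body, headers={"Content-Type": "application/json"})
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the request shape is plain OpenAI chat-completions JSON, the same client code works against Ollama's :11434/v1 endpoint by swapping the base URL.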

@andrewpareles
Contributor

I agree with @devlux76. @kongzhilv, can you try using the OpenAI-Compatible endpoint?
Happy to add an explicit endpoint for LM Studio just for clarity, but it would be routed through OpenAI-Compatible, so I want to make sure that works first.

Labels: none yet
Projects: 🚙 Void Roadmap — Status: 🔎 Improvements (A)