
issues running ollama #197

Open
Skruller01 opened this issue Feb 20, 2025 · 1 comment

Comments

@Skruller01
[Two screenshots of the error attached]
The error shown here does not make any sense to me, even though I have installed Ollama on my Windows machine. The problem is not limited to llama3.2; it also occurs with llama2 (i.e. the same issue arises with a lower-version model).

@ed-donner
Owner

Hey @Skruller01
The error about the missing 'message' in the response is caused by the failure in the prior cell. It appears that your network (DNS / VPN / firewall) isn't allowing you to download the model weights from Ollama. You could try running just "ollama pull llama3.2" from the command line, and you'll likely see the same error. If you're running in a corporate environment, would you be able to contact IT support? Alternatively, you could use the Colab notebook linked in the README to download the model weights onto Colab, and then move them to your PC. Let me know!
Ed
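As a side note for anyone hitting the same traceback: the KeyError on 'message' can be made much easier to diagnose with a small defensive check. When the model weights never downloaded, Ollama's chat endpoint returns an error payload rather than the usual `{'message': {'role': ..., 'content': ...}}` shape. The helper below is a hypothetical sketch (the function name `extract_reply` is not part of the course code or the Ollama library) showing one way to surface that root cause:

```python
def extract_reply(response_json: dict) -> str:
    """Return the assistant's text from an Ollama chat response,
    or raise a clear error explaining the likely root cause."""
    if "message" not in response_json:
        # Error payloads carry an 'error' field instead of 'message'.
        detail = response_json.get("error", "no detail in response")
        raise RuntimeError(
            "Ollama response has no 'message' key - the model probably "
            f"failed to download or isn't available locally ({detail})"
        )
    return response_json["message"]["content"]
```

You would call this on the parsed JSON from a POST to the local Ollama server (by default at http://localhost:11434/api/chat); if the model pull failed, the raised message points you straight at the download problem instead of a bare KeyError.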
