Replies: 1 comment
I checked and it still seems to work with local models for me. Your config looks correct. I tested it with ollama/llama3. Unfortunately, my machine can't run the model you're trying, so I can't check that one.
Hello, I just found this fantastic TUI, but it seems the Ollama models cannot be loaded?

This is what I added to the config.toml, and Ollama is running in server mode, but I still can't load the model in Edia.
Is there maybe something wrong with my configuration?
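
For anyone hitting the same issue, here is a minimal config.toml sketch for pointing the app at a local Ollama model. It is illustrative only, not the poster's actual config: it assumes the TUI accepts litellm-style model names such as ollama/llama3 (the name the reply above says was tested), and the default_model key is an assumption about the config schema.

```toml
# Minimal sketch, not the poster's actual config.
# Assumes an Ollama server is already running locally (default port 11434)
# and the model has been pulled, e.g. `ollama pull llama3` + `ollama serve`.

# Assumed key: pre-select the local model on launch.
default_model = "ollama/llama3"

[[models]]
# The "ollama/" prefix tells a litellm-style client to talk to the local
# Ollama server rather than a hosted API.
name = "ollama/llama3"
```

If the model still does not show up, it is worth confirming the Ollama server is actually reachable before digging further into the TUI config: `ollama list` should include the model, and `curl http://localhost:11434/api/tags` should return it.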