Model Settings

Connect to your LLM server and configure model parameters.

No model connected

Enter your Ollama server URL and connect to get started.
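As a minimal sketch of what "connecting" involves: a client can verify the server URL by querying Ollama's `/api/tags` endpoint, which lists the locally available models. The `base_url` value and the helper names below are illustrative, not part of this app.

```python
import json
from urllib.request import urlopen

def parse_models(payload: dict) -> list[str]:
    # Ollama's /api/tags responds with {"models": [{"name": ...}, ...]};
    # an empty or missing "models" list means no models are installed.
    return [m["name"] for m in payload.get("models", [])]

def list_models(base_url: str) -> list[str]:
    # base_url is a placeholder, e.g. "http://localhost:11434" (Ollama's default port).
    # Raises URLError if the server is unreachable, which the UI can surface
    # as the "No model connected" state.
    with urlopen(f"{base_url.rstrip('/')}/api/tags", timeout=5) as resp:
        return parse_models(json.load(resp))
```

A successful response with at least one model name confirms the connection; an empty list means the server is reachable but has no models pulled yet.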

The language model used for generating responses.