Select a model below and start chatting with your local AI
Settings
Enter the URL where Ollama is running
For local network access:
1. On the PC running Ollama, start it bound to all interfaces: OLLAMA_HOST=0.0.0.0 ollama serve
2. Set the URL above to: http://YOUR_PC_IP:11434
3. If connecting from a browser, also allow cross-origin requests: OLLAMA_ORIGINS=*
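The steps above can be sketched as a short shell session; the IP address 192.168.1.50 is a placeholder for the server's actual LAN address:

```shell
# On the machine running Ollama: bind to all interfaces so other
# devices on the LAN can reach it (the default binding is localhost only).
# OLLAMA_ORIGINS=* additionally permits cross-origin browser requests.
OLLAMA_HOST=0.0.0.0 OLLAMA_ORIGINS=* ollama serve

# From another device on the network, verify connectivity
# (replace 192.168.1.50 with the server's LAN IP):
curl http://192.168.1.50:11434/api/tags
```

If the curl command returns a JSON list of models, the app's URL setting should work with the same address.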