The LLM is whatever you want it to be: self-hosted, or from any provider with a compatible endpoint. In practice it's likely a proprietary one, because the cost of training LLMs means most of them are.
Looks like it supports locally hosted models as well, such as via Ollama: https://docs.openclaw.ai/providers. For anyone who actually wants something like this, at least there’s a way to self-host it 100%.
Yeah, but if I understand that correctly, that's just for the app itself; the LLM is very likely still a proprietary one (ChatGPT, Grok, …).
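For anyone curious what "a compatible endpoint" means in practice: Ollama exposes an OpenAI-compatible API on its default port, so a fully local setup might look something like this (a sketch assuming Ollama is running locally with a `llama3` model already pulled; the model name is illustrative):

```shell
# Ollama serves an OpenAI-compatible API at /v1 on port 11434 by default.
# Any client or app that lets you override the base URL can point here
# instead of at a proprietary provider.
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "llama3",
        "messages": [{"role": "user", "content": "Hello"}]
      }'
```

The same idea applies to apps like the one linked above: configure the base URL to the local endpoint and the "LLM" is whatever model you serve yourself.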