## Provider Coverage
| Type | Main options |
|---|---|
| Hosted providers | OpenAI, Anthropic, Groq, Together, Hugging Face |
| Local or self-hosted | Ollama, vLLM |
| Framework-mediated | Vercel AI SDK, LangChain integrations |
## Practical Guidance
- Start with the provider you already use in production.
- Add a second provider when you need lower cost, resilience, or model specialization.
- Use local or self-hosted providers when deployment control matters more than turnkey access.
- Use presets for speed, then move to custom configuration only when the workload demands it.
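The resilience point above can be sketched as a small fallback wrapper that tries providers in order. This is a generic illustration, not a real SDK API: the `Provider` type and the `primary`/`secondary` functions are hypothetical placeholders standing in for whatever client calls your actual providers expose.

```typescript
// Hypothetical provider signature: a prompt in, a completion out.
type Provider = (prompt: string) => Promise<string>;

// Try each provider in order; return the first successful response.
// If every provider fails, rethrow the last error seen.
async function completeWithFallback(
  providers: Provider[],
  prompt: string,
): Promise<string> {
  let lastError: unknown;
  for (const provider of providers) {
    try {
      return await provider(prompt);
    } catch (err) {
      lastError = err; // record the failure and fall through to the next provider
    }
  }
  throw lastError ?? new Error("no providers configured");
}

// Illustrative stand-ins: the primary is rate limited, the secondary answers.
const primary: Provider = async () => {
  throw new Error("rate limited");
};
const secondary: Provider = async (prompt) => `echo: ${prompt}`;

completeWithFallback([primary, secondary], "hello").then(console.log);
```

The same pattern extends to routing by cost or model specialization: order the `providers` array per request instead of fixing it globally.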