Use this page when you need to choose model providers, install the right extras, and decide whether presets or custom configs should be the starting point.

Provider Coverage

  • Hosted providers: OpenAI, Anthropic, Groq, Together, Hugging Face
  • Local or self-hosted: Ollama, vLLM
  • Framework-mediated: Vercel AI SDK, LangChain integrations

Practical Guidance

  • Start with the provider you already use in production.
  • Add a second provider when you need cost, resilience, or model specialization.
  • Use local or self-hosted providers when deployment control matters more than turnkey access.
  • Use presets for speed, then move to custom configuration only when the workload demands it.
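The second point above, adding a provider for resilience, usually means a fallback chain: try the primary, and only call the secondary when the primary errors out. A minimal sketch of that pattern, with hypothetical `call_primary` and `call_fallback` functions standing in for real SDK calls:

```python
# Hypothetical fallback chain; the call_* functions are placeholders,
# not a real provider SDK.

def call_primary(prompt: str) -> str:
    # Simulate an outage at the provider you use in production.
    raise TimeoutError("primary provider unavailable")

def call_fallback(prompt: str) -> str:
    # The secondary provider answers when the primary fails.
    return f"fallback answer to: {prompt}"

def complete(prompt: str) -> str:
    # Try each provider in order; move on when one raises.
    for call in (call_primary, call_fallback):
        try:
            return call(prompt)
        except Exception:
            continue
    raise RuntimeError("all providers failed")

print(complete("hello"))
```

In a real setup the exception handling would be narrower (timeouts and rate limits, not every `Exception`), but the control flow is the same.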

Install Paths
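Most providers ship as optional extras, so you install only the integrations you need. A sketch of what that typically looks like; `yourlib` and the extra names are placeholders for the actual package:

```shell
# Hypothetical extras; substitute the real package and extra names.
pip install "yourlib[openai]"   # one hosted provider
pip install "yourlib[ollama]"   # local / self-hosted
pip install "yourlib[all]"      # everything, at the cost of heavier dependencies
```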

Important Examples