Migration from Other Providers
Many Grok users migrate from other LLM providers. The xAI API is designed to be compatible with both the OpenAI and Anthropic SDKs, except for capabilities that the respective SDK does not offer. For the best experience and access to all features, we recommend the native xAI Python SDK or the Vercel AI SDK for JavaScript.
Migration takes two steps:
- When constructing the API client object, set the "base URL" to https://api.x.ai/v1 and the "API key" to your xAI API key (obtained from the xAI Console).
- When sending a message for inference, set "model" to one of the Grok model names.
If you use third-party tools such as LangChain (JavaScript/Python) or Continue, they usually provide a common base class for LLM providers, so you only need to change the provider settings and API key, as sketched below. Refer to their documentation for case-by-case instructions.
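For example, LangChain's OpenAI-compatible chat model only needs the endpoint and key swapped. A minimal sketch, assuming the langchain-openai Python package (the model name and prompt are illustrative):

import os

from langchain_openai import ChatOpenAI

# Point LangChain's OpenAI-compatible chat model at the xAI endpoint.
llm = ChatOpenAI(
    model="grok-4",
    api_key=os.environ["XAI_API_KEY"],
    base_url="https://api.x.ai/v1",
)

response = llm.invoke("Say hello in one sentence.")
print(response.content)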
Examples using OpenAI and Anthropic SDKs:
OpenAI SDK
from openai import OpenAI
client = OpenAI(
api_key=XAI_API_KEY,
base_url="https://api.x.ai/v1",
)
# ...
completion = client.chat.completions.create(
model="grok-4",
# ...
)
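Filled in with a sample conversation, the request above might look like the following sketch (the prompt and the environment-variable handling for XAI_API_KEY are illustrative):

import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["XAI_API_KEY"],
    base_url="https://api.x.ai/v1",
)

completion = client.chat.completions.create(
    model="grok-4",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is the meaning of life?"},
    ],
)

# The response text is in the first choice's message.
print(completion.choices[0].message.content)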
Anthropic SDK (Deprecated)
Deprecated: Anthropic SDK compatibility is fully deprecated. Please migrate to the Responses API or gRPC. We recommend using the native xAI Python SDK or the Vercel AI SDK for JavaScript.
from anthropic import Anthropic
client = Anthropic(
api_key=XAI_API_KEY,
base_url="https://api.x.ai",
)
# ...
message = client.messages.create(
model="grok-4",
# ...
)
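As a migration target, the native xAI Python SDK looks roughly like this. A minimal sketch, assuming the current xai_sdk package interface; helper names such as user and the sample() call are assumptions here, so check the SDK documentation for exact usage:

import os

from xai_sdk import Client
from xai_sdk.chat import user  # assumed helper for building user messages

# The native SDK targets the xAI API directly, so no base URL override is needed.
client = Client(api_key=os.environ["XAI_API_KEY"])

chat = client.chat.create(model="grok-4")
chat.append(user("What is the meaning of life?"))

response = chat.sample()
print(response.content)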