
Getting started with AI Gateway using a coding agent

Last updated February 24, 2026

If you use a coding agent like Claude Code, Cursor, or Cline, you can get started with AI Gateway by prompting your agent directly instead of writing each file by hand.

For step-by-step manual setup, see the manual quickstart.

Go to the AI Gateway API Keys page in your Vercel dashboard and click Create key to generate a new API key. Save the key as AI_GATEWAY_API_KEY in your environment.

Give your agent up-to-date knowledge of the AI SDK by installing the AI SDK skill:

Terminal
npx skills add vercel/ai --skill ai-sdk

This works with Claude Code, Cursor, Cline, and 18+ other agents. The skill ensures your agent uses current AI SDK APIs rather than outdated patterns.

The fastest way to confirm your API key is working is to have your agent make a single request. Copy this prompt into your agent:

Prompt
Make a request to the Vercel AI Gateway to verify my API key works.

- Use cURL to POST to https://ai-gateway.vercel.sh/v1/responses
- Authenticate with a Bearer token using my AI_GATEWAY_API_KEY env var
- Use the model "anthropic/claude-sonnet-4.6"
- Send the prompt: "Invent a new holiday and describe its traditions."
- Run it and show me the response.

Your agent will run something like:

Terminal
curl -X POST "https://ai-gateway.vercel.sh/v1/responses" \
  -H "Authorization: Bearer $AI_GATEWAY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "anthropic/claude-sonnet-4.6",
    "input": [
      {
        "type": "message",
        "role": "user",
        "content": "Invent a new holiday and describe its traditions."
      }
    ]
  }'

If you see a model response, your API key and AI Gateway are working.
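The same check can also be scripted instead of run through cURL. Below is a minimal sketch using Node.js 18+'s built-in fetch; the request shape mirrors the cURL call above, and `verifyKey` is a name chosen here for illustration:

```typescript
// Sketch of the same verification request using Node's built-in fetch (Node 18+).
// Assumes AI_GATEWAY_API_KEY is set in the environment.
const requestBody = {
  model: "anthropic/claude-sonnet-4.6",
  input: [
    {
      type: "message",
      role: "user",
      content: "Invent a new holiday and describe its traditions.",
    },
  ],
};

async function verifyKey(): Promise<unknown> {
  const res = await fetch("https://ai-gateway.vercel.sh/v1/responses", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.AI_GATEWAY_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(requestBody),
  });
  if (!res.ok) throw new Error(`Request failed with status ${res.status}`);
  return res.json();
}

// Once your key is in place, call it and log the result:
// verifyKey().then(console.log).catch(console.error);
```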

To create a full project that uses AI Gateway with the AI SDK, prompt your agent:

Prompt
Create a TypeScript project that uses the AI SDK to stream a response
from Vercel AI Gateway.

- Initialize with pnpm and install the `ai` package, dotenv,
  @types/node, tsx, and typescript
- Store the API key in .env.local as AI_GATEWAY_API_KEY
- Use streamText with the model "openai/gpt-5.2"
- Stream the output to stdout, then log token usage and finish reason
- Run it with tsx to verify it works

Your agent will create the project, install dependencies, write the code, and run it.

If you already have a project, prompt your agent to add AI Gateway:

Prompt
Add AI Gateway to this project using the AI SDK.

- Install the `ai` package if not already installed
- Use my AI_GATEWAY_API_KEY from .env.local
- Models use the format "provider/model", for example "openai/gpt-5.2"

Your agent will determine where and how to integrate based on your project's structure and framework.

See available models for the full list of supported model strings.
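Model strings always take the `provider/model` form. If you need to inspect or validate them in your own code, a small illustrative helper might look like this (`parseModel` is a name chosen here, not an AI SDK export):

```typescript
// Illustrative helper: split an AI Gateway model string into its parts.
function parseModel(id: string): { provider: string; model: string } {
  const slash = id.indexOf("/");
  if (slash === -1) {
    throw new Error(`Expected "provider/model", got "${id}"`);
  }
  return { provider: id.slice(0, slash), model: id.slice(slash + 1) };
}

console.log(parseModel("openai/gpt-5.2")); // { provider: "openai", model: "gpt-5.2" }
console.log(parseModel("anthropic/claude-sonnet-4.6"));
```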

OpenResponses endpoint: https://ai-gateway.vercel.sh/v1/responses
OpenAI-compatible endpoint: https://ai-gateway.vercel.sh/v1
Anthropic-compatible endpoint: https://ai-gateway.vercel.sh
Auth header: Authorization: Bearer <AI_GATEWAY_API_KEY>
Model format: provider/model (e.g., openai/gpt-5.2, anthropic/claude-sonnet-4.6)
Env variable: AI_GATEWAY_API_KEY
AI SDK package: ai (uses AI Gateway automatically with model strings)
