Quickstart guide

Learn how to use the Featherless API to build with any open model.

Prerequisites

Our API is OpenAI-compatible, meaning any client program that works with OpenAI as an AI/inference provider can be reconfigured to use Featherless with little effort.

  1. Sign up for an account at Featherless.

  2. Get your API key from the dashboard.

  3. Make your first API call:

OpenAI SDK - Python
from openai import OpenAI

# Point the OpenAI client at the Featherless API
client = OpenAI(
    base_url="https://api.featherless.ai/v1",
    api_key="FEATHERLESS_API_KEY",  # replace with your API key
)

response = client.chat.completions.create(
    model="meta-llama/Meta-Llama-3.1-8B-Instruct",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"}
    ],
)
print(response.choices[0].message.content)
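
Because the API is OpenAI-compatible, the SDK's streaming mode works the same way against this base URL; the sketch below assumes the endpoint streams chat completions when stream=True is set, as OpenAI-compatible servers typically do.

OpenAI SDK - Python (streaming)
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",
    api_key="FEATHERLESS_API_KEY",  # replace with your API key
)

# stream=True yields chunks as the model generates tokens
stream = client.chat.completions.create(
    model="meta-llama/Meta-Llama-3.1-8B-Instruct",
    messages=[{"role": "user", "content": "Write a haiku about open models."}],
    stream=True,  # assumes the endpoint supports streaming
)
for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
print()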

You can also make direct requests to our API endpoints (the most important being /completions and /chat/completions) to integrate Featherless into any software application.

Featherless API - Python
import requests

response = requests.post(
    url="https://api.featherless.ai/v1/chat/completions",
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer FEATHERLESS_API_KEY"
    },
    json={
        "model": "meta-llama/Meta-Llama-3.1-8B-Instruct",
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user","content": "Hello! How are you?"}
        ]
    }
)
print(response.json()["choices"][0]["message"]["content"])
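
The /completions endpoint takes a raw prompt instead of a message list. A minimal sketch, assuming it follows the standard OpenAI completions schema (model, prompt, max_tokens) and returns generated text in choices[0].text:

Featherless API - Python
import requests

# Text completion: send a raw prompt instead of a chat message list
response = requests.post(
    url="https://api.featherless.ai/v1/completions",
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer FEATHERLESS_API_KEY"  # replace with your API key
    },
    json={
        "model": "meta-llama/Meta-Llama-3.1-8B-Instruct",
        "prompt": "Once upon a time",
        "max_tokens": 64  # assumed parameter name, per the OpenAI completions schema
    }
)
print(response.json()["choices"][0]["text"])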

What’s next?

Now that you’ve made your first Featherless AI API request, have a look at the following:
