Examples and Snippets
Snippets for accessing Featherless through common tools and libraries
OpenAI SDK
1. Install the OpenAI SDK
meta-llama/Meta-Llama-3.1-8B-Instruct is served through an OpenAI-compatible API, so we will use the OpenAI SDK to run it.
Install the OpenAI package and import it into your code.
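For example, using pip (the SDK is published on PyPI as openai):

Shell
pip install openai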
2. Configure the SDK Client
Change the Base URL and API Key to run your chat completions through Featherless.
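A minimal client configuration might look like the following, with "YOUR-API-KEY" standing in for your Featherless API key (the full example below includes the same setup):

Python
from openai import OpenAI

# Route all requests through the Featherless OpenAI-compatible endpoint.
client = OpenAI(
    base_url="https://api.featherless.ai/v1",
    api_key="YOUR-API-KEY",
)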
3. Run your completions
Congratulations! You can now use your open-source LLM. Make sure to specify the model when you run a completion.
Python example
from openai import OpenAI

# Point the client at the Featherless OpenAI-compatible endpoint.
client = OpenAI(
    base_url="https://api.featherless.ai/v1",
    api_key="YOUR-API-KEY",
)

response = client.chat.completions.create(
    model="meta-llama/Meta-Llama-3.1-8B-Instruct",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
)

# Print the assistant's reply.
print(response.model_dump()["choices"][0]["message"]["content"])
LangChain
You can use LangChain to connect to the model. Make sure to replace the placeholders with your actual values.
Install the langchain-openai package and import it into your code.
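For example, using pip (the integration is published on PyPI as langchain-openai). The Python example below reads the key from the FEATHERLESS_API_KEY environment variable, so export it first with your own key:

Shell
pip install langchain-openai
export FEATHERLESS_API_KEY=YOUR-API-KEY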
Python - LangChain
import os

from langchain_openai import ChatOpenAI

# Configure ChatOpenAI to use the Featherless OpenAI-compatible endpoint.
llm = ChatOpenAI(
    api_key=os.getenv("FEATHERLESS_API_KEY"),
    base_url="https://api.featherless.ai/v1",
    model="meta-llama/Meta-Llama-3.1-8B-Instruct",
)

messages = [
    (
        "system",
        "You are a helpful assistant that translates English to French. Translate the user sentence.",
    ),
    (
        "human",
        "I love programming.",
    ),
]

ai_msg = llm.invoke(messages)
print(ai_msg.content)
Curl Example
You can use cURL to connect to the model. Make sure to replace the placeholders with your actual values.
Curl Example
# Replace with your Featherless API key.
API_KEY=YOUR-API-KEY
BASE_URL=https://api.featherless.ai/v1
MODEL_NAME=meta-llama/Meta-Llama-3.1-8B-Instruct

# Build the JSON request body, substituting the model name.
request_body=$(cat << JSON
{
  "model": "$MODEL_NAME",
  "prompt": "Question: What NFL team won the Super Bowl in the year Justin Bieber was born?\n\nAnswer: let's think step by step",
  "max_tokens": 1000
}
JSON
)

curl \
  -H "Authorization: Bearer $API_KEY" \
  -H "Content-Type: application/json" \
  -d "$request_body" \
  "$BASE_URL/completions"
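Because the API is OpenAI-compatible, the same variables also work against the chat completions endpoint. A minimal sketch, assuming the standard /chat/completions path and reusing API_KEY, BASE_URL, and MODEL_NAME from above:

Curl Example
# Build a chat-style request body instead of a raw prompt.
chat_body=$(cat << JSON
{
  "model": "$MODEL_NAME",
  "messages": [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"}
  ]
}
JSON
)

curl \
  -H "Authorization: Bearer $API_KEY" \
  -H "Content-Type: application/json" \
  -d "$chat_body" \
  "$BASE_URL/chat/completions"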