OpenAI SDK
Use OnlySq AI models through the OpenAI SDK. The API is compatible with the official OpenAI client libraries, so you can point existing projects at the OnlySq API, use any of the 40+ available models for free, and integrate it into projects such as Exteragram.
Supported Languages
TypeScript / JavaScript, Python, Java, .NET, Go
Installation
First, install the required dependency and import the SDK.
python
pip install openai
Initializing Client
python
from openai import OpenAI
client = OpenAI(
    base_url="https://api.onlysq.ru/ai/openai/",
    api_key="openai"  # or your valid key
)
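If the OnlySq endpoint also exposes the standard model listing route (an assumption; this page does not document it), the same client can enumerate the available models:
python
from openai import OpenAI

client = OpenAI(
    base_url="https://api.onlysq.ru/ai/openai/",
    api_key="openai"  # or your valid key
)

# Assumes the endpoint implements the OpenAI-style GET /models route
for model in client.models.list():
    print(model.id)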
Chat Completions
Here's a basic example of using Chat Completions.
python
from openai import OpenAI

client = OpenAI(
    base_url="https://api.onlysq.ru/ai/openai",
    api_key="openai",
)

completion = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {
            "role": "user",
            "content": "Say 5 short facts about AI.",
        },
    ],
)

print(completion.choices[0].message.content)
Example Response
Here are five short facts about AI:
- AI Learns from Data – AI systems improve by analyzing large amounts of data, identifying patterns, and making predictions.
- Narrow vs. General AI – Most AI today is "narrow" (task-specific), while "general AI" (human-like reasoning) is still theoretical.
- AI Powers Everyday Tech – Virtual assistants (Siri, Alexa), recommendations (Netflix, Spotify), and spam filters all use AI.
- Ethical Concerns Exist – AI raises issues like bias in algorithms, job displacement, and privacy risks.
- AI is Evolving Fast – Breakthroughs in deep learning and generative AI (like ChatGPT) are rapidly advancing the field.
State Management
Use the messages parameter to build the conversation history. It can include a system (developer) message and multi-turn exchanges between the user and the assistant.
python
from openai import OpenAI

client = OpenAI(
    base_url="https://api.onlysq.ru/ai/openai",
    api_key="openai",
)

completion = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {
            "role": "system",
            "content": "You are a neko-helper.",
        },
        {
            "role": "user",
            "content": "What's 5 + 5?",
        },
        {
            "role": "assistant",
            "content": "Nya~! 5 + 5 is 10, purr-fectly simple! 😸✨"
        },
        {
            "role": "user",
            "content": "Say what do you like",
        },
    ],
)

print(completion.choices[0].message.content)
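To keep state across multiple calls in a real application, append each assistant reply to the history before sending the next user turn. A minimal sketch, assuming the same model and endpoint as above; the ask helper is illustrative and not part of the OnlySq API:
python
from openai import OpenAI

client = OpenAI(
    base_url="https://api.onlysq.ru/ai/openai",
    api_key="openai",
)

# Running conversation history, reused across calls
history = [{"role": "system", "content": "You are a neko-helper."}]

def ask(text: str) -> str:
    history.append({"role": "user", "content": text})
    completion = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=history,
    )
    reply = completion.choices[0].message.content
    # Store the assistant's reply so the next call sees the full conversation
    history.append({"role": "assistant", "content": reply})
    return reply

print(ask("What's 5 + 5?"))
print(ask("Say what do you like"))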
Streaming Chat Completions
Pass stream=True to receive the response incrementally as it is generated.
python
from openai import OpenAI

client = OpenAI(api_key="openai", base_url="https://api.onlysq.ru/ai/openai/")

messages = [
    {
        "role": "user",
        "content": "Write a one-line story about AI."
    }
]

r = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=messages,
    stream=True
)

for chunk in r:
    # The final chunk may arrive with an empty delta, so guard against None
    if chunk.choices[0].delta.content is not None:
        print(chunk.choices[0].delta.content, end="")
Streaming Chunk Example
json
{
    "id": "chatcmpl_lQdhTfXn4yKoDlyCjuBpL0GfPZNonZufGqOyGl3wVpMhJRwP",
    "object": "chat.completion.chunk",
    "created": 1750612428,
    "model": "gpt-4o-mini",
    "choices": [
        {
            "index": 0,
            "delta": {
                "content": "te",
                "role": "assistant"
            },
            "finish_reason": null
        }
    ],
    "usage": null
}
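To rebuild the full reply from a stream, concatenate the delta.content pieces as the chunks arrive. A minimal sketch using the same request as above; the final chunk's delta may be empty, hence the None check:
python
from openai import OpenAI

client = OpenAI(api_key="openai", base_url="https://api.onlysq.ru/ai/openai/")

stream = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Write a one-line story about AI."}],
    stream=True,
)

# Collect each incremental piece and join them into the final message
parts = []
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta is not None:
        parts.append(delta)

print("".join(parts))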
Supported Parameters
- model - required
- messages - required