Python SDK
Official Python SDK for the Lumnis AI API
Overview
The official Python SDK for Lumnis AI provides a simple, intuitive interface for building AI-powered applications, with full type hints and async support included.
Installation
pip install lumnisai
Quick Start
from lumnisai import Client, display_progress
# Initialize the client
client = Client(api_key="your-api-key")
# Create a simple response
response = client.invoke("What is the meaning of life?")
print(response.output_text)
# With streaming
for update in client.invoke("Explain quantum computing", stream=True):
    display_progress(update)
    if update.state == "completed":
        print(f"\n{update.output_text}")
Configuration
Environment Variables
export LUMNISAI_API_KEY="your-api-key"
export LUMNISAI_TENANT_ID="your-tenant-id"  # Optional
Client Initialization
from lumnisai import Client
# Using API key directly
client = Client(api_key="your-api-key")
# Using environment variables (LUMNISAI_API_KEY)
client = Client()
# Custom configuration
client = Client(
    api_key="your-api-key",
    timeout=60.0,    # 60 second timeout
    max_retries=3    # Retry up to 3 times
)
Async Support
All methods have async equivalents using AsyncClient:
import asyncio
from lumnisai import AsyncClient, display_progress
async def main():
    client = AsyncClient(api_key="your-api-key")
    # Create response asynchronously
    response = await client.invoke("Hello!")
    print(response.output_text)
    # Streaming with display_progress
    async for update in await client.invoke("Analyze this data", stream=True):
        display_progress(update)
        if update.state == "completed":
            print(f"\n{update.output_text}")
asyncio.run(main())
Using AsyncClient in Jupyter/Colab
from lumnisai import AsyncClient, display_progress
# No need for asyncio.run() in notebooks
client = AsyncClient(api_key="your-api-key")
# Direct await
response = await client.invoke("What are the latest AI trends?")
print(response.output_text)
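# Concurrency sketch (assumes only the awaitable invoke() call shown above):
# independent prompts can run in parallel with asyncio.gather.
import asyncio
prompts = ["Summarize topic A", "Summarize topic B"]
results = await asyncio.gather(*(client.invoke(p) for p in prompts))
for r in results:
    print(r.output_text)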
# Streaming with display_progress
async for update in await client.invoke("Research topic", stream=True):
display_progress(update)Context Managers
from lumnisai import AsyncClient, Client
# Automatic cleanup
with Client(api_key="your-api-key") as client:
    response = client.invoke("Hello!")
    print(response.output_text)
# Async version (use inside an async function)
async with AsyncClient(api_key="your-api-key") as client:
    response = await client.invoke("Hello!")
    print(response.output_text)
User-Scoped Operations
from lumnisai import Client
client = Client(api_key="your-api-key")
# Create a user-scoped client
user_client = client.for_user("user@example.com")
response = user_client.invoke("What's the weather?")
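# Fan-out sketch (assumes only the for_user()/invoke() calls shown above):
# send the same prompt on behalf of several users.
for email in ["alice@example.com", "bob@example.com"]:
    print(client.for_user(email).invoke("What's the weather?").output_text)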
# Or use as context manager
with client.as_user("user@example.com") as user_client:
    response = user_client.invoke("What's the weather?")
Next Steps
Explore the different SDK capabilities:
- AI Responses - Create, stream, and manage AI responses
- User Management - Create and manage users
- File Management - Upload, search, and manage files
- Integrations - Connect to external services
- MCP Servers - Manage Model Context Protocol servers
- Model Preferences - Configure AI models
- External API Keys - Manage BYO keys
- Advanced Usage - Error handling, testing, and more (a minimal error-handling sketch follows this list)
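Until you reach the Advanced Usage guide, a minimal error-handling sketch looks like the following; it reuses only the invoke() call shown above and a broad except, since this page does not list the SDK's specific exception classes.
from lumnisai import Client
client = Client(api_key="your-api-key", timeout=60.0, max_retries=3)
try:
    response = client.invoke("Summarize the latest release notes")
    print(response.output_text)
except Exception as exc:  # swap in the SDK's exception types from Advanced Usage
    print(f"Request failed: {exc}")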
Support
- Documentation: https://docs.lumnis.ai
- GitHub: https://github.com/Lumnis-AI/lumnisai-python
- PyPI: https://pypi.org/project/lumnisai/
- Support Email: support@lumnis.ai
License
MIT License © Lumnis AI