Lumnis AI Python SDK

Official Python SDK for the Lumnis AI platform
Welcome to the comprehensive documentation for the Lumnis AI Python SDK. This guide covers everything you need to build AI-powered applications with Lumnis AI's multi-tenant platform.
Table of Contents
- Introduction
- Installation & Setup
- Quick Start Guide
- Core Concepts
- API Reference
- Examples & Tutorials
- Best Practices
- Troubleshooting
Introduction
The Lumnis AI Python SDK is the official client library for interacting with the Lumnis AI platform. It provides a powerful, type-safe interface for building agent-oriented applications with support for multiple AI providers.
Key Features
- 🏢 Multi-tenant Architecture: Build applications that serve multiple organizations with proper data isolation
- 👥 User Management: Complete user lifecycle management with cascade operations
- 🤖 Multiple AI Providers: Seamless integration with OpenAI, Anthropic, Google, Azure, and more
- ⚡ Async/Sync Support: Choose between synchronous and asynchronous programming models
- 💬 Conversation Threads: Maintain context across multiple interactions
- 📊 Real-time Progress: Track AI agent progress with streaming updates
- 🔒 Type Safety: Full type hints and Pydantic models for robust development
- 🛡️ Comprehensive Error Handling: Detailed exceptions for debugging
Why Lumnis AI?
Lumnis AI simplifies building production-ready AI applications by providing:
- Enterprise-ready infrastructure with built-in multi-tenancy
- Unified API across different AI providers
- Automatic rate limiting and retry logic
- Built-in security with proper scope isolation
- Progress tracking for long-running AI tasks
Installation & Setup
Requirements
- Python 3.8 or higher
- An active Lumnis AI account and API key
Installation
Install the SDK using pip:
```bash
pip install lumnisai
```
Configuration
Environment Variables
Create a `.env` file or set these environment variables:

```bash
# Required
export LUMNISAI_API_KEY="your-api-key-here"

# Optional (with defaults)
export LUMNISAI_BASE_URL="https://api.lumnis.ai"
export LUMNISAI_TENANT_ID="your-tenant-id"  # Auto-detected from API key
```
Getting Your API Key
- Sign up at lumnis.ai
- Navigate to Settings → API Keys
- Create a new API key with appropriate permissions
- Copy the key and store it securely
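Once the key is set, it can help to fail fast at startup rather than on the first API call. A minimal sketch (the helper name is illustrative, not part of the SDK):

```python
import os

def get_lumnis_api_key() -> str:
    """Fail fast at startup if the API key is missing."""
    key = os.environ.get("LUMNISAI_API_KEY")
    if not key:
        raise RuntimeError(
            "LUMNISAI_API_KEY is not set; export it or add it to your .env file"
        )
    return key
```

`lumnisai.Client()` reads the variable itself; this check just surfaces a misconfiguration earlier, with a clearer message.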
Quick Start Guide
Basic Usage
```python
import lumnisai

# Initialize client (defaults to user scope)
client = lumnisai.Client()

# Simple AI interaction
response = client.invoke(
    "Explain quantum computing in simple terms",
    user_id="user-123"
)
print(response.output_text)
```
Async Usage
```python
import asyncio
import lumnisai

async def main():
    client = lumnisai.AsyncClient()
    response = await client.invoke(
        "What are the latest AI trends?",
        user_id="user-123"
    )
    print(response.output_text)
    await client.close()

asyncio.run(main())
```
Streaming Responses
```python
import asyncio
import lumnisai

async def stream_example():
    client = lumnisai.AsyncClient()
    async for update in await client.invoke(
        "Research renewable energy trends",
        stream=True,
        user_id="user-123"
    ):
        print(f"{update.state}: {update.message}")
        if update.state == "completed":
            print(f"Result: {update.output_text}")

asyncio.run(stream_example())
```
Core Concepts
1. Multi-Tenant Architecture
Lumnis AI is designed for multi-tenant applications where each tenant (organization) can have multiple users.
```
┌─────────────────────────────────────┐
│          Lumnis AI Platform         │
├─────────────────────────────────────┤
│  ┌─────────────┐  ┌─────────────┐   │
│  │  Tenant A   │  │  Tenant B   │   │
│  ├─────────────┤  ├─────────────┤   │
│  │  • User 1   │  │  • User 1   │   │
│  │  • User 2   │  │  • User 2   │   │
│  │  • User 3   │  │  • User 3   │   │
│  └─────────────┘  └─────────────┘   │
└─────────────────────────────────────┘
```
2. Scopes: User vs Tenant
User Scope (Default - Recommended)
- Operations are limited to a specific user
- Provides data isolation and security
- Requires the `user_id` parameter
- Best for customer-facing applications
Tenant Scope (Admin Only)
- Operations affect entire tenant
- Access to all users' data
- Requires admin permissions
- Best for admin dashboards
```python
# User scope (default)
client = lumnisai.Client(scope=lumnisai.Scope.USER)
response = client.invoke("Hello", user_id="user-123")

# Tenant scope
admin_client = lumnisai.Client(scope=lumnisai.Scope.TENANT)
all_users = admin_client.list_users()
```
3. User-Scoped Clients
Three ways to handle user context:
```python
# Method 1: Pass user_id each time
client.invoke("Hello", user_id="user-123")

# Method 2: Create persistent user client
user_client = client.for_user("user-123")
user_client.invoke("Hello")

# Method 3: Temporary user context
with client.as_user("user-123") as user_client:
    user_client.invoke("Hello")
```
4. Progress Tracking
Monitor long-running AI tasks:
```python
# Automatic progress printing
response = await client.invoke(
    "Complex research task",
    user_id="user-123",
    progress=True
)

# Manual progress handling
async for update in await client.invoke(
    "Complex task",
    stream=True,
    user_id="user-123"
):
    if update.state == "processing":
        print(f"Progress: {update.message}")
```
API Reference
Client Classes
lumnisai.Client
Synchronous client for the Lumnis AI API.
```python
client = lumnisai.Client(
    api_key: str = None,       # API key (or from env)
    base_url: str = None,      # API endpoint
    tenant_id: str = None,     # Tenant ID (auto-detected)
    timeout: float = 30.0,     # Request timeout
    max_retries: int = 3,      # Retry attempts
    scope: Scope = Scope.USER  # Operation scope
)
```
lumnisai.AsyncClient
Asynchronous client with the same interface as `Client`.

```python
async_client = lumnisai.AsyncClient(
    # Same parameters as Client
)
```
Core Methods
invoke()
Main method for AI interactions.
```python
response = client.invoke(
    prompt: Union[str, Dict, List[Dict]],   # Message(s)
    user_id: Optional[str] = None,          # Required for user scope
    thread_id: Optional[str] = None,        # Conversation thread
    stream: bool = False,                   # Enable streaming
    progress: bool = False,                 # Print progress
    idempotency_key: Optional[str] = None,  # Prevent duplicates
    **kwargs                                # Additional parameters
)
```
Returns:
- If `stream=False`: a `ResponseObject` with the final result
- If `stream=True`: an `AsyncIterator[ResponseObject]` with updates
User Management
Create User
```python
user = client.create_user(
    email: str,                        # Unique email
    first_name: Optional[str] = None,
    last_name: Optional[str] = None
)
```
Get User
```python
# By ID
user = client.get_user("550e8400-e29b-41d4-a716-446655440000")

# By email
user = client.get_user("alice@example.com")
```
Update User
```python
user = client.update_user(
    user_id: str,
    email: Optional[str] = None,
    first_name: Optional[str] = None,
    last_name: Optional[str] = None
)
```
List Users
```python
response = client.list_users(
    page: int = 1,
    page_size: int = 10,
    search: Optional[str] = None
)

for user in response.users:
    print(f"{user.email}: {user.first_name} {user.last_name}")
```
Delete User
```python
# Cascades to all user data
client.delete_user(user_id: str)
```
Thread Management
Create Thread
```python
thread = client.create_thread(
    user_id: str,
    title: Optional[str] = None,
    metadata: Optional[Dict] = None
)
```
Get Thread
```python
thread = client.get_thread(
    thread_id: str,
    user_id: Optional[str] = None  # Required for user scope
)
```
List Threads
```python
threads = client.list_threads(
    user_id: Optional[str] = None,  # Required for user scope
    limit: int = 10,
    offset: int = 0
)
```
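The `limit`/`offset` pair is plain offset pagination, so walking a large account is a simple loop. A hedged sketch of a generic pager (`fetch_page` stands in for a bound call such as `client.list_threads`; none of these helper names come from the SDK):

```python
from typing import Callable, Iterator, List, TypeVar

T = TypeVar("T")

def paginate(fetch_page: Callable[[int, int], List[T]],
             limit: int = 10) -> Iterator[T]:
    """Yield every item, fetching one page at a time until a short page ends the scan."""
    offset = 0
    while True:
        page = fetch_page(limit, offset)
        yield from page
        if len(page) < limit:  # a short (or empty) page means we are done
            return
        offset += limit
```

Wired to the SDK it might look like `paginate(lambda limit, offset: client.list_threads(user_id="user-123", limit=limit, offset=offset))`, assuming the call returns a plain list; adapt if it returns a wrapper object.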
Response Management
Get Response
```python
response = client.get_response(
    response_id: str,
    wait: Optional[float] = None  # Wait for completion
)
```
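When the `wait` parameter is not used, the same effect can be approximated by polling client-side. A sketch with an injected status getter (standing in for something like `lambda: client.get_response(response_id).status`; the helper itself is illustrative, not part of the SDK):

```python
import time
from typing import Callable

def wait_until_done(get_status: Callable[[], str],
                    poll_interval: float = 1.0,
                    timeout: float = 60.0) -> str:
    """Poll until the status leaves 'processing' or the deadline passes."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = get_status()
        if status != "processing":  # e.g. succeeded or failed
            return status
        time.sleep(poll_interval)
    raise TimeoutError("response did not finish before the timeout")
```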
List Responses
```python
responses = client.list_responses(
    user_id: Optional[str] = None,
    thread_id: Optional[str] = None,
    limit: int = 10,
    offset: int = 0
)
```
Cancel Response
```python
cancelled = client.cancel_response(response_id: str)
```
External API Keys
Add API Key
```python
client.add_api_key(
    provider: ApiProvider,
    api_key: str,
    user_id: Optional[str] = None  # For user-specific keys
)
```
List API Keys
```python
keys = client.list_api_keys(
    user_id: Optional[str] = None
)
```
Delete API Key
```python
client.delete_api_key(
    provider: ApiProvider,
    user_id: Optional[str] = None
)
```
Data Models
ResponseObject
```python
class ResponseObject:
    response_id: str               # Unique identifier
    status: str                    # Status (processing, succeeded, failed)
    state: str                     # Current state
    message: Optional[str]         # Progress message
    output_text: Optional[str]     # Final result
    thread_id: Optional[str]       # Associated thread
    created_at: datetime           # Creation timestamp
    updated_at: datetime           # Last update
    progress: List[ProgressEntry]  # Progress history
```
User
```python
class User:
    id: str     # UUID
    email: str  # Unique email
    first_name: Optional[str]
    last_name: Optional[str]
    created_at: datetime
    updated_at: datetime
```
ThreadObject
```python
class ThreadObject:
    thread_id: str
    user_id: str
    title: Optional[str]
    created_at: datetime
    updated_at: datetime
    metadata: Optional[Dict]
```
Enums
Scope
```python
class Scope(Enum):
    USER = "USER"      # User-level operations
    TENANT = "TENANT"  # Tenant-level operations
```
ApiProvider
```python
class ApiProvider(Enum):
    OPENAI_API_KEY = "OPENAI_API_KEY"
    ANTHROPIC_API_KEY = "ANTHROPIC_API_KEY"
    GOOGLE_API_KEY = "GOOGLE_API_KEY"
    AZURE_API_KEY = "AZURE_API_KEY"
    EXA_API_KEY = "EXA_API_KEY"
    # ... more providers
```
Exceptions
```python
# Base exception
class LumnisAIError(Exception):
    """Base exception with credential sanitization"""

# Authentication
class AuthenticationError(LumnisAIError):
    """Invalid API key or unauthorized"""

# Validation
class ValidationError(LumnisAIError):
    """Invalid request parameters"""

# Scope errors
class MissingUserId(LumnisAIError):
    """user_id required for user scope"""

class TenantScopeUserIdConflict(LumnisAIError):
    """user_id not allowed in tenant scope"""

# Rate limiting
class RateLimitError(LumnisAIError):
    """Rate limit exceeded"""
    retry_after: Optional[int]

# Not found
class NotFoundError(LumnisAIError):
    """Resource not found"""

# Transport
class TransportError(LumnisAIError):
    """Network or connection error"""
```
Examples & Tutorials
1. Building a Chatbot
```python
import lumnisai
from typing import List, Dict

class Chatbot:
    def __init__(self, user_id: str):
        self.client = lumnisai.Client().for_user(user_id)
        self.thread_id = None

    def start_conversation(self, title: str = "New Chat"):
        thread = self.client.create_thread(title=title)
        self.thread_id = thread.thread_id
        return thread

    def send_message(self, message: str) -> str:
        response = self.client.invoke(
            message,
            thread_id=self.thread_id,
            progress=True
        )
        return response.output_text

    def get_history(self) -> List[Dict]:
        responses = self.client.list_responses(
            thread_id=self.thread_id
        )
        return responses

# Usage
bot = Chatbot("user-123")
bot.start_conversation("Tech Support")
reply = bot.send_message("How do I reset my password?")
print(reply)
```
2. Research Assistant
```python
import asyncio
import lumnisai
from typing import List

class ResearchAssistant:
    def __init__(self):
        self.client = lumnisai.AsyncClient()

    async def research_topic(self, topic: str, user_id: str):
        print(f"Researching: {topic}")
        async for update in await self.client.invoke(
            f"Research the topic '{topic}' and provide a comprehensive report with citations",
            user_id=user_id,
            stream=True
        ):
            if update.state == "processing":
                print(f"📊 {update.message}")
            elif update.state == "completed":
                return update.output_text

    async def summarize_documents(self, documents: List[str], user_id: str):
        combined = "\n\n".join(documents)
        response = await self.client.invoke(
            f"Summarize these documents:\n\n{combined}",
            user_id=user_id
        )
        return response.output_text

# Usage
async def main():
    assistant = ResearchAssistant()
    report = await assistant.research_topic(
        "Latest developments in quantum computing",
        "user-123"
    )
    print(report)

asyncio.run(main())
```
3. Multi-User Application
```python
import lumnisai
from typing import Dict

class MultiUserApp:
    def __init__(self):
        self.client = lumnisai.Client(scope=lumnisai.Scope.TENANT)
        self.user_clients: Dict[str, lumnisai.Client] = {}

    def create_user_account(self, email: str, first_name: str, last_name: str):
        # Create user in system
        user = self.client.create_user(
            email=email,
            first_name=first_name,
            last_name=last_name
        )
        # Create user-specific client
        self.user_clients[user.id] = self.client.for_user(user.id)
        return user

    def get_user_client(self, user_id: str) -> lumnisai.Client:
        if user_id not in self.user_clients:
            self.user_clients[user_id] = self.client.for_user(user_id)
        return self.user_clients[user_id]

    def process_user_request(self, user_id: str, request: str):
        user_client = self.get_user_client(user_id)
        return user_client.invoke(request)

# Usage
app = MultiUserApp()

# Create users
alice = app.create_user_account("alice@example.com", "Alice", "Smith")
bob = app.create_user_account("bob@example.com", "Bob", "Jones")

# Process requests
alice_response = app.process_user_request(alice.id, "Help me plan a trip to Japan")
bob_response = app.process_user_request(bob.id, "Explain machine learning basics")
```
4. Error Handling Pattern
```python
import time
import lumnisai
from lumnisai.exceptions import (
    AuthenticationError,
    RateLimitError,
    ValidationError,
    NotFoundError
)

def safe_invoke(client, prompt, user_id, max_retries=3):
    for attempt in range(max_retries):
        try:
            return client.invoke(prompt, user_id=user_id)
        except AuthenticationError:
            print("Invalid API key. Please check your credentials.")
            raise
        except RateLimitError as e:
            if e.retry_after and attempt < max_retries - 1:
                print(f"Rate limited. Waiting {e.retry_after} seconds...")
                time.sleep(e.retry_after)
                continue
            raise
        except ValidationError as e:
            print(f"Invalid request: {e}")
            raise
        except NotFoundError:
            print("User or resource not found")
            raise
        except Exception as e:
            print(f"Unexpected error: {e}")
            if attempt < max_retries - 1:
                time.sleep(2 ** attempt)  # Exponential backoff
                continue
            raise

# Usage
client = lumnisai.Client()
try:
    response = safe_invoke(
        client,
        "Hello world",
        "user-123"
    )
    print(response.output_text)
except Exception as e:
    print(f"Failed after retries: {e}")
```
5. Progress Callback Example
```python
import asyncio
import lumnisai
from datetime import datetime

async def custom_progress_handler():
    client = lumnisai.AsyncClient()
    start_time = datetime.now()

    async for update in await client.invoke(
        "Analyze global climate data and trends",
        user_id="user-123",
        stream=True
    ):
        elapsed = int((datetime.now() - start_time).total_seconds())
        print(f"[{elapsed}s] {update.state.upper()}: {update.message}")

        # Custom progress bar
        if update.state == "processing":
            progress_chars = "." * (elapsed % 4)
            print(f"  Working{progress_chars}", end="\r")

        if update.state == "completed":
            print(f"\n✅ Completed in {elapsed} seconds")
            print(f"Result: {update.output_text[:200]}...")

asyncio.run(custom_progress_handler())
```
Best Practices
1. Security
- Never hardcode API keys - Use environment variables
- Use user scope by default - Principle of least privilege
- Validate user permissions - Check access before operations
- Sanitize user input - Prevent prompt injection
```python
# Good: Environment variable
client = lumnisai.Client()  # Uses LUMNISAI_API_KEY env var

# Bad: Hardcoded key
client = lumnisai.Client(api_key="sk-live-abc123")  # Don't do this!
```
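"Sanitize user input" can start as simply as stripping control characters and bounding length before the text reaches `invoke()`. A minimal, illustrative helper (a basic hygiene step under assumed limits, not a complete prompt-injection defense; the name and cap are not part of the SDK):

```python
import unicodedata

MAX_PROMPT_CHARS = 4000  # illustrative cap; tune for your use case

def sanitize_prompt(raw: str) -> str:
    """Drop control characters (keeping newlines/tabs), trim, and cap length."""
    cleaned = "".join(
        ch for ch in raw
        if ch in "\n\t" or unicodedata.category(ch)[0] != "C"
    )
    return cleaned.strip()[:MAX_PROMPT_CHARS]
```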
2. Performance
- Reuse clients - Create once, use many times
- Use async for I/O - Better performance for concurrent requests
- Enable streaming - For real-time feedback on long tasks
- Batch operations - Use pagination for large datasets
```python
import asyncio
import lumnisai

# Good: Reuse client
class MyApp:
    def __init__(self):
        self.client = lumnisai.AsyncClient()

    async def process_many(self, items):
        tasks = [self.client.invoke(item, user_id="123") for item in items]
        return await asyncio.gather(*tasks)

# Bad: Create client per request
async def process_item(item):
    client = lumnisai.AsyncClient()  # Don't create repeatedly
    return await client.invoke(item)
```
3. Error Handling
- Always handle exceptions - Graceful degradation
- Log errors properly - Include context, not credentials
- Implement retries - For transient failures
- Provide user feedback - Clear error messages
```python
import logging
import lumnisai

logger = logging.getLogger(__name__)

async def robust_invoke(client, prompt, user_id):
    try:
        return await client.invoke(prompt, user_id=user_id)
    except lumnisai.RateLimitError:
        logger.warning(f"Rate limited for user {user_id}")
        return {"error": "Too many requests. Please try again later."}
    except lumnisai.AuthenticationError:
        logger.error("Authentication failed")
        return {"error": "Authentication failed. Please contact support."}
    except Exception as e:
        logger.error(f"Unexpected error: {e}")
        return {"error": "An unexpected error occurred."}
```
4. User Experience
- Show progress - Use `progress=True` or streaming
- Set expectations - Inform users about processing time
- Handle timeouts - Implement appropriate timeout values
- Provide feedback - Acknowledge user actions
```python
import asyncio

async def user_friendly_invoke(client, prompt, user_id):
    print("Processing your request...")
    try:
        response = await client.invoke(
            prompt,
            user_id=user_id,
            progress=True,
            timeout=60.0  # Reasonable timeout
        )
        return response.output_text
    except asyncio.TimeoutError:
        return "The request took too long. Please try a simpler query."
```
5. Testing
- Mock API calls - Don't hit production in tests
- Test error cases - Ensure graceful handling
- Validate types - Use type checkers
- Test async code - Use proper async test frameworks
```python
# Example test with mocking
import pytest
import lumnisai
from unittest.mock import AsyncMock, Mock, patch

@pytest.mark.asyncio
async def test_invoke_success():
    with patch('lumnisai.AsyncClient') as MockClient:
        mock_client = Mock()
        mock_response = Mock(output_text="Test response")
        # invoke() is awaited, so it must be an AsyncMock, not a plain Mock
        mock_client.invoke = AsyncMock(return_value=mock_response)
        MockClient.return_value = mock_client

        # Test your code here
        client = lumnisai.AsyncClient()
        response = await client.invoke("Test", user_id="123")
        assert response.output_text == "Test response"
```
Troubleshooting
Common Issues
1. Authentication Errors
Problem: AuthenticationError: Invalid API key
Solutions:
- Verify API key is correct
- Check environment variable is set
- Ensure API key has proper permissions
- Regenerate key if compromised
```bash
# Check environment variable
echo $LUMNISAI_API_KEY
```

```python
# Set in Python
import os
os.environ['LUMNISAI_API_KEY'] = 'your-key'
```
2. Missing User ID
Problem: MissingUserId: user_id is required when scope is USER
Solution: Always provide user_id in user scope:
```python
# Wrong
client = lumnisai.Client()
response = client.invoke("Hello")  # Missing user_id

# Correct
response = client.invoke("Hello", user_id="user-123")

# Or use user-scoped client
user_client = client.for_user("user-123")
response = user_client.invoke("Hello")
```
3. Rate Limiting
Problem: RateLimitError: Rate limit exceeded
Solutions:
- Implement exponential backoff
- Use batch operations
- Cache responses when possible
- Request rate limit increase
```python
import time
import lumnisai

def handle_rate_limit(func, *args, **kwargs):
    max_retries = 5
    for i in range(max_retries):
        try:
            return func(*args, **kwargs)
        except lumnisai.RateLimitError as e:
            if i < max_retries - 1:
                wait_time = e.retry_after or (2 ** i)
                time.sleep(wait_time)
            else:
                raise
```
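Caching responses, as suggested above, also reduces rate-limit pressure: identical prompts within a window can skip the API entirely. A small illustrative TTL cache (not part of the SDK):

```python
import time
from typing import Any, Dict, Optional, Tuple

class TTLCache:
    """Tiny in-memory cache that forgets entries after ttl_seconds."""

    def __init__(self, ttl_seconds: float = 300.0):
        self.ttl = ttl_seconds
        self._store: Dict[str, Tuple[float, Any]] = {}

    def get(self, key: str) -> Optional[Any]:
        entry = self._store.get(key)
        if entry is None:
            return None
        expires_at, value = entry
        if time.monotonic() >= expires_at:
            del self._store[key]  # expired; drop it
            return None
        return value

    def put(self, key: str, value: Any) -> None:
        self._store[key] = (time.monotonic() + self.ttl, value)
```

Check `cache.get(...)` before calling `client.invoke(...)` and `cache.put(...)` afterwards, keying on both prompt and `user_id` so cached answers never leak across users.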
4. Connection Issues
Problem: TransportError: Connection failed
Solutions:
- Check internet connection
- Verify base URL is correct
- Check firewall/proxy settings
- Increase timeout value
```python
# Increase timeout
client = lumnisai.Client(timeout=60.0)

# Custom base URL
client = lumnisai.Client(
    base_url="https://api.lumnis.ai"
)
```
5. Async Context Issues
Problem: RuntimeError: This event loop is already running
Solution: Use proper async context:
```python
# In Jupyter notebooks
import nest_asyncio
nest_asyncio.apply()

# Or use sync client
client = lumnisai.Client()  # Sync version
response = client.invoke("Hello", user_id="123")
```
Debug Mode
Enable detailed logging for troubleshooting:
```python
import logging

# Enable debug logging
logging.basicConfig(level=logging.DEBUG)

# Or specific to lumnisai
logging.getLogger('lumnisai').setLevel(logging.DEBUG)
```
Getting Help
If you encounter issues:
- Check the documentation - This guide and the API reference
- Review examples - Working code in /docs/examples
- Search issues - GitHub Issues
- Contact support - dev@lumnis.ai
When reporting issues, include:
- SDK version (`pip show lumnisai`)
- Python version
- Error message and stack trace
- Minimal code to reproduce
Additional Resources
- GitHub Repository: github.com/lumnisai/lumnisai-python
- API Documentation: api.lumnis.ai/docs
- Platform Dashboard: app.lumnis.ai
- Support Email: dev@lumnis.ai
Built with ❤️ by the Lumnis AI team