Supported Providers
Rubric integrates with major LLM providers to evaluate your AI models in production.

| Provider | Models | Integration Method |
|---|---|---|
| OpenAI | GPT-4, GPT-4 Turbo, GPT-3.5 | API Key |
| Anthropic | Claude 3, Claude 2 | API Key |
| Azure OpenAI | GPT-4, GPT-3.5 (Azure-hosted) | Azure AD / API Key |
| Google Vertex AI | Gemini Pro, PaLM 2 | Service Account |
| AWS Bedrock | Claude, Titan, Llama 2 | IAM Role |
| Custom Endpoints | Any OpenAI-compatible API | API Key / OAuth |
OpenAI Integration
openai_integration.py
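A minimal sketch of what this file might contain, using the official OpenAI Python SDK (v1.x). The `rubric.evaluate()` call at the end is an assumption about Rubric's API, shown only to illustrate where the provider output would be handed to Rubric; the model ID and prompt are placeholders.

```python
import os

from openai import OpenAI  # official OpenAI Python SDK (v1.x)

import rubric  # hypothetical import -- Rubric's actual API may differ

# Read the key from the environment rather than hardcoding it (see Best Practices).
client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

prompt = "Summarize our refund policy."
response = client.chat.completions.create(
    model="gpt-4-turbo",
    messages=[{"role": "user", "content": prompt}],
)
output = response.choices[0].message.content

# Hypothetical Rubric hand-off: the function name and arguments are assumptions,
# not the documented Rubric API.
result = rubric.evaluate(
    provider="openai",
    model="gpt-4-turbo",
    prompt=prompt,
    completion=output,
)
print(result)
```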
Anthropic Integration
anthropic_integration.py
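A similar sketch using the official Anthropic Python SDK; the model ID is a placeholder, and the hand-off to Rubric would mirror the OpenAI example above.

```python
import os

import anthropic  # official Anthropic Python SDK

client = anthropic.Anthropic(api_key=os.environ["ANTHROPIC_API_KEY"])

message = client.messages.create(
    model="claude-3-opus-20240229",  # placeholder model ID
    max_tokens=512,
    messages=[{"role": "user", "content": "Summarize our refund policy."}],
)
output = message.content[0].text

# Pass the prompt/completion pair to Rubric as in the OpenAI example above.
```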
Azure OpenAI Integration
azure_openai_integration.py
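A sketch using the `AzureOpenAI` client from the OpenAI SDK. The endpoint, API version, and deployment name are placeholders; for Azure AD authentication you would construct the client with a token provider instead of an API key.

```python
import os

from openai import AzureOpenAI  # Azure-flavoured client in the OpenAI SDK

client = AzureOpenAI(
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",  # placeholder API version
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # https://<resource>.openai.azure.com
)
# For Azure AD auth, pass azure_ad_token_provider= instead of api_key=.

response = client.chat.completions.create(
    model="my-gpt-4-deployment",  # the Azure *deployment* name, not the model name
    messages=[{"role": "user", "content": "Summarize our refund policy."}],
)
output = response.choices[0].message.content
```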
Google Vertex AI Integration
vertex_ai_integration.py
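A sketch using the `vertexai` SDK (part of `google-cloud-aiplatform`), which authenticates through the service account referenced by `GOOGLE_APPLICATION_CREDENTIALS`; the project, location, and model are placeholders.

```python
import vertexai  # part of the google-cloud-aiplatform package
from vertexai.generative_models import GenerativeModel

# Authenticates via the service account pointed to by GOOGLE_APPLICATION_CREDENTIALS.
vertexai.init(project="my-gcp-project", location="us-central1")  # placeholders

model = GenerativeModel("gemini-pro")
response = model.generate_content("Summarize our refund policy.")
output = response.text
```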
AWS Bedrock Integration
bedrock_integration.py
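A sketch using `boto3`, which picks up credentials from the ambient IAM role rather than an API key; the region and model ID are placeholders.

```python
import json

import boto3  # Bedrock is called through the standard AWS SDK

# Credentials come from the ambient IAM role or AWS profile -- no API key needed.
client = boto3.client("bedrock-runtime", region_name="us-east-1")  # placeholder region

body = json.dumps({
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 512,
    "messages": [{"role": "user", "content": "Summarize our refund policy."}],
})
response = client.invoke_model(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # placeholder model ID
    body=body,
)
output = json.loads(response["body"].read())["content"][0]["text"]
```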
Custom Endpoints
Connect to any OpenAI-compatible API:

custom_endpoint.py
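Because the endpoint speaks the OpenAI API, the OpenAI SDK can be pointed at it by overriding `base_url`; the URL, key variable, and model name below are placeholders.

```python
import os

from openai import OpenAI

# Any OpenAI-compatible server (vLLM, a self-hosted gateway, etc.) can be reached
# by overriding base_url; the URL, key variable, and model name are placeholders.
client = OpenAI(
    base_url="https://llm.internal.example.com/v1",
    api_key=os.environ["CUSTOM_LLM_API_KEY"],
)

response = client.chat.completions.create(
    model="my-custom-model",
    messages=[{"role": "user", "content": "Summarize our refund policy."}],
)
output = response.choices[0].message.content
```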
Best Practices
| Practice | Rationale |
|---|---|
| Use environment variables | Never hardcode API keys in source code |
| Rotate keys regularly | Minimize exposure from compromised keys |
| Set up usage alerts | Monitor for unexpected API usage spikes |
| Use separate keys per environment | Isolate production from development |
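As a small illustration of the first and last practices, the key for the current environment can be selected at startup; the variable names below are illustrative, not a Rubric convention.

```python
import os

# Select the key for the current environment at startup; the variable names
# here are illustrative, not a Rubric convention.
env = os.environ.get("RUBRIC_ENV", "development")       # e.g. production, staging
api_key = os.environ[f"OPENAI_API_KEY_{env.upper()}"]    # one key per environment
```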
