
Summary

Issue: Creating a prompt via the Python SDK with a custom provider model fails when the human-readable model name is passed as the model parameter.

Cause: Custom providers in Braintrust can be configured with base64-encoded model identifiers. In that case, the SDK requires the exact encoded identifier as configured, not the display name shown in the UI.

Resolution: Pass the base64-encoded model identifier from your custom provider's Models configuration as the model value.

Resolution steps

Step 1: Find the encoded model identifier

Navigate to Settings → AI providers → your custom provider → Models. Copy the full base64-encoded model identifier string.
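To illustrate why the display name and the configured identifier differ, here is a minimal sketch using Python's standard base64 module. The provider and model names below ("custom-provider/my-model") are hypothetical; the actual identifier must be copied from the Models configuration, not constructed by hand:

import base64

# Hypothetical example: the UI may show a human-readable name such as
# "custom-provider/my-model", while the configured identifier is the
# base64-encoded form, which looks opaque:
encoded = base64.b64encode(b"custom-provider/my-model").decode("ascii")
print(encoded)  # Y3VzdG9tLXByb3ZpZGVyL215LW1vZGVs

# Decoding recovers the readable name, confirming the two refer to the
# same model:
decoded = base64.b64decode(encoded).decode("utf-8")
print(decoded)  # custom-provider/my-model

The SDK matches the model parameter against the configured (encoded) string, which is why passing the decoded display name fails.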

Step 2: Pass the encoded identifier as model

Use the encoded string directly in project.prompts.create:
import braintrust

project = braintrust.projects.create(name="your-project")

project.prompts.create(
    name="my-prompt",
    slug="my-prompt",
    model="<base64-encoded-model-identifier>",  # Use the encoded string from provider config
    messages=[
        {"role": "system", "content": "You are a helpful assistant."}
    ]
)