
A Compute-First Business Model

The status quo of “pay-per-character” API pricing eats into margins as usage scales. We are changing that. Dedicated deployments are also ideal for enterprise use cases requiring strict data privacy, dedicated throughput, or custom model fine-tuning. The MARS 8 Series is available on the following providers:
  • Google Cloud Vertex AI
  • AWS Bedrock
  • Hugging Face
  • Baseten
  • Modal
  • Cerebrium
  • Simplismart
  • Replicate
For access, contact [email protected]

Routing Via Camb AI SDK

The Camb AI SDK supports routing text-to-speech generation requests directly to private, custom deployments of the MARS model on any of the providers listed above.

Baseten Deployment

Initialize the client with your Baseten API key and the URL of your model deployment.

Baseten Provider Example
from camb.client import CambAI

client = CambAI(
    tts_provider="baseten",
    provider_params={
        "api_key": "YOUR_BASETEN_API_KEY",
        "mars_pro_url": "https://model-xxxxxx.api.baseten.co/environments/production/predict"
    }
)

Usage Example

For custom deployments, you must pass reference_audio via additional_body_parameters.
import base64
from camb.client import save_stream_to_file

# Prepare audio reference
with open("reference.wav", "rb") as f:
    ref_audio_b64 = base64.b64encode(f.read()).decode("utf-8")

response = client.text_to_speech.tts(
    text="Generating audio from a private Baseten deployment.",
    language="en-us",
    speech_model="mars-pro",
    request_options={
        "additional_body_parameters": {
            "reference_audio": ref_audio_b64,
            "reference_language": "en-us"
        },
        "timeout_in_seconds": 300
    }
)

save_stream_to_file(response, "baseten_output.mp3")
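The base64 encoding step above can be factored into a small reusable helper. This is a minimal sketch; the function name encode_reference_audio is our own and is not part of the Camb AI SDK:

```python
import base64
from pathlib import Path

def encode_reference_audio(path: str) -> str:
    # Read the reference audio file as raw bytes and return a
    # base64 string suitable for the reference_audio parameter.
    raw = Path(path).read_bytes()
    return base64.b64encode(raw).decode("utf-8")
```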

How it Works

  1. The SDK bypasses standard Camb AI API endpoints.
  2. It authenticates directly with Baseten using the provided API Key.
  3. It streams generated audio directly from the model instance.
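The steps above can be sketched with a plain HTTP call. The Api-Key authorization header is Baseten's standard scheme for model endpoints, but the exact wire format of a MARS deployment's request body is an assumption here, not the SDK's documented protocol:

```python
import json
from urllib import request

def build_mars_payload(text: str, reference_audio_b64: str,
                       language: str = "en-us") -> dict:
    # Field names mirror the SDK parameters shown above; the exact
    # request schema expected by a MARS deployment is an assumption.
    return {
        "text": text,
        "language": language,
        "reference_audio": reference_audio_b64,
        "reference_language": language,
    }

def post_to_baseten(url: str, api_key: str, payload: dict) -> bytes:
    # Authenticate directly with Baseten (Api-Key header) and return
    # the raw audio bytes, bypassing the standard Camb AI endpoints.
    req = request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Api-Key {api_key}",
            "Content-Type": "application/json",
        },
    )
    with request.urlopen(req) as resp:
        return resp.read()
```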

Vertex AI Deployment

Support for Vertex AI is currently in progress.

Usage Example

client = CambAI(
    tts_provider="vertex",
    provider_params={
        "project_id": "your-gcp-project-id",
        "location": "us-central1"
    }
)
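Until Vertex AI routing ships, it is reasonable to expect the provider_params above to resolve to a regional Vertex prediction endpoint. The helper below reflects Google's documented endpoint URL scheme only; the endpoint ID and the mapping itself are assumptions, not the SDK's behavior:

```python
def vertex_predict_url(project_id: str, location: str, endpoint_id: str) -> str:
    # Google's documented Vertex AI online-prediction URL scheme.
    # How the Camb AI SDK will construct its requests is not yet public.
    return (
        f"https://{location}-aiplatform.googleapis.com/v1/"
        f"projects/{project_id}/locations/{location}/"
        f"endpoints/{endpoint_id}:predict"
    )
```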